| sha (string, 40 chars) | text (string, 0–13.4M chars) | id (string, 2–117 chars) | tags (sequence) | created_at (string, 25 chars) | metadata (string, 2–31.7M chars) | last_modified (string, 25 chars) |
|---|---|---|---|---|---|---|
e0bfe54a74e818bf98efc539a898da2ae6e1051f |
# Dataset of m1014/M1014/M1014 (Girls' Frontline)
This is the dataset of m1014/M1014/M1014 (Girls' Frontline), containing 30 images and their tags.
The core tags of this character are `long_hair, bangs, hair_between_eyes, heterochromia, red_eyes, hair_ornament, breasts, yellow_eyes, brown_hair, hat, black_hair, headphones, hairclip, large_breasts, very_long_hair, sidelocks, x_hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 33.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1014_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 21.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1014_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 62 | 41.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1014_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 30.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1014_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 62 | 54.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1014_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
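The `IMG+TXT` packages are plain archives in which each image is expected to sit next to a same-named `.txt` file holding its comma-separated tags; this layout is an assumption based on the package type rather than something this card states. A minimal sketch for downloading the 800px package and pairing images with their tag files:
```python
import glob
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download and extract the 800px IMG+TXT package (repo and filename taken from the table above).
zip_file = hf_hub_download(
    repo_id='CyberHarem/m1014_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Pair each image with its same-named .txt tag file (image extensions are assumed).
for ext in ('*.png', '*.jpg', '*.webp'):
    for image_path in sorted(glob.glob(os.path.join(dataset_dir, '**', ext), recursive=True)):
        tag_path = os.path.splitext(image_path)[0] + '.txt'
        if os.path.exists(tag_path):
            with open(tag_path, 'r', encoding='utf-8') as f:
                print(image_path, f.read().strip())
```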
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/m1014_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
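Building on the loop above, items can also be filtered by their tag metadata before further processing; a small sketch, assuming (as the print above suggests) that `item.meta['tags']` contains the tag names, with illustrative tag choices:
```python
# Re-create the source and keep only items carrying both tags (tag names are illustrative).
source = LocalSource(dataset_dir)
for item in source:
    tags = item.meta['tags']
    if 'solo' in tags and 'looking_at_viewer' in tags:
        print('kept:', item.meta['filename'])
```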
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, solo, black_gloves, looking_at_viewer, closed_mouth, fingerless_gloves, holding_gun, cleavage, dress, long_sleeves, beret, blush, brown_eyes, shotgun_shell, wide_sleeves, character_name, collarbone, medium_breasts, open_jacket, thigh_strap, white_background, standing |
| 1 | 6 |  |  |  |  |  | 1girl, black_kimono, blush, hair_flower, long_sleeves, obi, solo, wide_sleeves, standing, full_body, gun, looking_at_viewer, sandals, single_hair_bun, tabi, zouri, cleavage, closed_mouth, holding_umbrella, oil-paper_umbrella, open_mouth, print_kimono, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | black_gloves | looking_at_viewer | closed_mouth | fingerless_gloves | holding_gun | cleavage | dress | long_sleeves | beret | blush | brown_eyes | shotgun_shell | wide_sleeves | character_name | collarbone | medium_breasts | open_jacket | thigh_strap | white_background | standing | black_kimono | hair_flower | obi | full_body | gun | sandals | single_hair_bun | tabi | zouri | holding_umbrella | oil-paper_umbrella | open_mouth | print_kimono | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------------|:---------------|:--------------------|:--------------|:-----------|:--------|:---------------|:--------|:--------|:-------------|:----------------|:---------------|:-----------------|:-------------|:-----------------|:--------------|:--------------|:-------------------|:-----------|:---------------|:--------------|:------|:------------|:------|:----------|:------------------|:-------|:--------|:-------------------|:---------------------|:-------------|:---------------|:--------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | X | X | | | X | | X | | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/m1014_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:11+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:23:59+00:00 |
43235b8322f7b0494f42c8f4889d0d27807c7148 |
# Dataset of m1919a4/M1919A4/M1919A4 (Girls' Frontline)
This is the dataset of m1919a4/M1919A4/M1919A4 (Girls' Frontline), containing 28 images and their tags.
The core tags of this character are `blonde_hair, long_hair, hair_ornament, red_eyes, breasts, bangs, hairclip, small_breasts, pointy_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 28 | 26.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1919a4_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 28 | 20.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1919a4_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 60 | 37.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1919a4_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 28 | 25.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1919a4_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 60 | 45.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1919a4_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/m1919a4_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, looking_at_viewer, open_mouth, solo, navel, blush, barefoot, cape, full_body, hair_bow, vampire, wrist_cuffs, fangs, skull_hair_ornament, fingernails, nipples, red_bow, simple_background, smile, toenail_polish, white_background |
| 1 | 5 |  |  |  |  |  | 1girl, machine_gun, solo, long_sleeves, looking_at_viewer, smile, white_pantyhose, ammunition_belt, brown_headwear, brown_jacket, bullet, closed_mouth, garrison_cap, holding_gun, open_jacket, open_mouth, pink_eyes, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | open_mouth | solo | navel | blush | barefoot | cape | full_body | hair_bow | vampire | wrist_cuffs | fangs | skull_hair_ornament | fingernails | nipples | red_bow | simple_background | smile | toenail_polish | white_background | machine_gun | long_sleeves | white_pantyhose | ammunition_belt | brown_headwear | brown_jacket | bullet | closed_mouth | garrison_cap | holding_gun | open_jacket | pink_eyes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------------|:-------|:--------|:--------|:-----------|:-------|:------------|:-----------|:----------|:--------------|:--------|:----------------------|:--------------|:----------|:----------|:--------------------|:--------|:-----------------|:-------------------|:--------------|:---------------|:------------------|:------------------|:-----------------|:---------------|:---------|:---------------|:---------------|:--------------|:--------------|:------------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | | | | | | | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/m1919a4_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:12+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:23:22+00:00 |
28d3d4f7c157eb6e36755f7e1ce86e333ae1e00e |
# Dataset of g43/G43/G43 (Girls' Frontline)
This is the dataset of g43/G43/G43 (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are `blonde_hair, hat, blue_eyes, black_headwear, braid, short_hair, military_hat, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 11.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/g43_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 7.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/g43_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 22 | 14.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/g43_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 10.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/g43_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 22 | 21.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/g43_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/g43_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, smile, looking_at_viewer, rifle, simple_background, white_background, closed_mouth, holding_gun, military_uniform, thighhighs, gloves, jewelry |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | looking_at_viewer | rifle | simple_background | white_background | closed_mouth | holding_gun | military_uniform | thighhighs | gloves | jewelry |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------|:--------------------|:-------------------|:---------------|:--------------|:-------------------|:-------------|:---------|:----------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/g43_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:15+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:21:36+00:00 |
9936d437f7eb4a33cfa36237d58b89397d18c5ed |
# Dataset of caws/CAWS/CAWS (Girls' Frontline)
This is the dataset of caws/CAWS/CAWS (Girls' Frontline), containing 19 images and their tags.
The core tags of this character are `black_hair, yellow_eyes, bangs, hair_bun, breasts, braid, medium_breasts, short_hair, eyeshadow, goggles_on_head, side_braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 21.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caws_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 13.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caws_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 22.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caws_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 19.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caws_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 31.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caws_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/caws_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, solo, goggles, makeup, blush, gloves, jacket, looking_at_viewer, holding_gun, hood_down, long_sleeves, open_clothes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | goggles | makeup | blush | gloves | jacket | looking_at_viewer | holding_gun | hood_down | long_sleeves | open_clothes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:---------|:--------|:---------|:---------|:--------------------|:--------------|:------------|:---------------|:---------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/caws_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:17+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:21:44+00:00 |
5df819082bacd017626b6624d5803cdeb827c4d5 | senhorsapo/NickNelson | [
"license:openrail",
"region:us"
] | 2024-01-14T01:18:22+00:00 | {"license": "openrail"} | 2024-01-14T01:18:33+00:00 |
|
1146ba4936e4f7619c1a0e52c6ab4b7277298156 |
# Dataset of ak_74u/AK-74U/AK-74U (Girls' Frontline)
This is the dataset of ak_74u/AK-74U/AK-74U (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, breasts, bangs, hair_between_eyes, long_hair, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 15.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak_74u_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 9.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak_74u_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 26 | 18.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak_74u_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 13.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak_74u_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 26 | 26.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak_74u_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/ak_74u_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, looking_at_viewer, choker, cleavage, assault_rifle, black_jacket, white_background, fingerless_gloves, holding_gun, open_jacket, shorts, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | choker | cleavage | assault_rifle | black_jacket | white_background | fingerless_gloves | holding_gun | open_jacket | shorts | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:---------|:-----------|:----------------|:---------------|:-------------------|:--------------------|:--------------|:--------------|:---------|:--------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ak_74u_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:24+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:21:22+00:00 |
7b48fc3f35a7df8086e0c71128d15809e3503cd6 |
# Dataset of fg42/FG42/FG42 (Girls' Frontline)
This is the dataset of fg42/FG42/FG42 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are `blonde_hair, hat, blue_eyes, bangs, garrison_cap, long_hair, medium_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 18.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fg42_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 7.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fg42_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 23 | 15.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fg42_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 14.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fg42_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 23 | 27.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fg42_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/fg42_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, looking_at_viewer, white_gloves, black_pantyhose, blue_skirt, holding_gun, rifle, simple_background, standing, uniform, white_background, white_shirt, belt, black_necktie, blush, closed_mouth, full_body, pouch, short_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | white_gloves | black_pantyhose | blue_skirt | holding_gun | rifle | simple_background | standing | uniform | white_background | white_shirt | belt | black_necktie | blush | closed_mouth | full_body | pouch | short_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:---------------|:------------------|:-------------|:--------------|:--------|:--------------------|:-----------|:----------|:-------------------|:--------------|:-------|:----------------|:--------|:---------------|:------------|:--------|:----------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/fg42_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:46+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:22:11+00:00 |
5ee5175c4949278c9a061904ffb3f23223e3abf8 |
# Dataset of px4_storm/Px4ストーム/Px4风暴 (Girls' Frontline)
This is the dataset of px4_storm/Px4ストーム/Px4风暴 (Girls' Frontline), containing 31 images and their tags.
The core tags of this character are `green_eyes, blonde_hair, breasts, bangs, large_breasts, mole_under_eye, mole, short_hair, hair_between_eyes, medium_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 31 | 40.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 31 | 22.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 80 | 48.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 31 | 35.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 80 | 68.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/px4_storm_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, gloves, solo, hood_up, blush, looking_at_viewer, white_background, dress, character_name, handgun, black_coat, holding_gun, skindentation, thigh_strap, thighs |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blush, looking_at_viewer, navel, solo, white_bikini, cleavage, collarbone, hairclip, halterneck, simple_background, white_background, bare_legs, closed_mouth, feet, full_body, holding, o-ring_bikini, orange_hair, parted_lips, sandals, sarong, sky, smile, standing, stomach, thighs, toes, wet, white_footwear |
| 2 | 7 |  |  |  |  |  | 1girl, blush, red_sweater, smile, looking_at_viewer, solo, turtleneck, black_pantyhose, beret, earrings, necklace, panties, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | gloves | solo | hood_up | blush | looking_at_viewer | white_background | dress | character_name | handgun | black_coat | holding_gun | skindentation | thigh_strap | thighs | bare_shoulders | navel | white_bikini | cleavage | collarbone | hairclip | halterneck | simple_background | bare_legs | closed_mouth | feet | full_body | holding | o-ring_bikini | orange_hair | parted_lips | sandals | sarong | sky | smile | standing | stomach | toes | wet | white_footwear | red_sweater | turtleneck | black_pantyhose | beret | earrings | necklace | panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------|:----------|:--------|:--------------------|:-------------------|:--------|:-----------------|:----------|:-------------|:--------------|:----------------|:--------------|:---------|:-----------------|:--------|:---------------|:-----------|:-------------|:-----------|:-------------|:--------------------|:------------|:---------------|:-------|:------------|:----------|:----------------|:--------------|:--------------|:----------|:---------|:------|:--------|:-----------|:----------|:-------|:------|:-----------------|:--------------|:-------------|:------------------|:--------|:-----------|:-----------|:----------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | X | | X | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X |
| CyberHarem/px4_storm_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:19:17+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:26:35+00:00 |
899e64a3394f1585dbf8f838827af1296f885776 | StinkRat239577/riddles | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T01:19:58+00:00 | {"license": "apache-2.0"} | 2024-01-14T01:20:11+00:00 |
|
6d00e03062dfdc97687c5101930c1c3286684e18 | fsdfdsffd/dscxczxc | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T01:25:11+00:00 | {"license": "apache-2.0"} | 2024-01-14T01:26:33+00:00 |
|
98556e909e16623c2329b4bcd2ad6d3bbf0415a2 |
# Dataset Card for Evaluation run of macadeliccc/polyglot-math-4x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/polyglot-math-4x7b](https://huggingface.co/macadeliccc/polyglot-math-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b",
                    "harness_winogrande_5",
                    split="train")
```
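For a quick look at what a details split contains, the loaded split can be converted to a pandas DataFrame; a minimal sketch (column names vary per task, so this only inspects the shape and columns):
```python
# Inspect the per-sample details loaded above.
df = data.to_pandas()
print(df.shape)
print(df.columns.tolist())
```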
## Latest results
These are the [latest results from run 2024-01-14T01:25:55.830403](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b/blob/main/results_2024-01-14T01-25-55.830403.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6367747877161951,
"acc_stderr": 0.03232816338890694,
"acc_norm": 0.6393383626953215,
"acc_norm_stderr": 0.03297276004070419,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5378477391082209,
"mc2_stderr": 0.015247687104643274
},
"harness|arc:challenge|25": {
"acc": 0.6023890784982935,
"acc_stderr": 0.014301752223279542,
"acc_norm": 0.6373720136518771,
"acc_norm_stderr": 0.014049106564955009
},
"harness|hellaswag|10": {
"acc": 0.6549492133041227,
"acc_stderr": 0.004744132825391518,
"acc_norm": 0.8485361481776539,
"acc_norm_stderr": 0.0035776774950640874
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724053,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724053
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612896,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612896
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906944,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906944
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809784,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809784
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508283,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39664804469273746,
"acc_stderr": 0.01636135476982247,
"acc_norm": 0.39664804469273746,
"acc_norm_stderr": 0.01636135476982247
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.02916312857067073,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.02916312857067073
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700032,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700032
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5378477391082209,
"mc2_stderr": 0.015247687104643274
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059282
},
"harness|gsm8k|5": {
"acc": 0.5663381349507203,
"acc_stderr": 0.013650728047064695
}
}
```
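As a rough way to read this blob, the 5-shot MMLU (hendrycksTest) subtask scores can be averaged directly from the results file linked above; a minimal sketch, assuming the file keeps its per-task scores under a top-level `"results"` key, with an unweighted mean that only approximates the leaderboard's own aggregation:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the results file linked in "Latest results" above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b",
    repo_type="dataset",
    filename="results_2024-01-14T01-25-55.830403.json",
)
with open(path, "r", encoding="utf-8") as f:
    results = json.load(f)["results"]  # top-level "results" key is assumed

# Unweighted mean accuracy over the MMLU subtasks.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```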
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b | [
"region:us"
] | 2024-01-14T01:28:14+00:00 | {"pretty_name": "Evaluation run of macadeliccc/polyglot-math-4x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/polyglot-math-4x7b](https://huggingface.co/macadeliccc/polyglot-math-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T01:25:55.830403](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b/blob/main/results_2024-01-14T01-25-55.830403.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6367747877161951,\n \"acc_stderr\": 0.03232816338890694,\n \"acc_norm\": 0.6393383626953215,\n \"acc_norm_stderr\": 0.03297276004070419,\n \"mc1\": 0.3635250917992656,\n \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5378477391082209,\n \"mc2_stderr\": 0.015247687104643274\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.014301752223279542,\n \"acc_norm\": 0.6373720136518771,\n \"acc_norm_stderr\": 0.014049106564955009\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6549492133041227,\n \"acc_stderr\": 0.004744132825391518,\n \"acc_norm\": 0.8485361481776539,\n \"acc_norm_stderr\": 0.0035776774950640874\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724053,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724053\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n 
\"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612896,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612896\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906944,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906944\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809784,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809784\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39664804469273746,\n \"acc_stderr\": 0.01636135476982247,\n \"acc_norm\": 0.39664804469273746,\n \"acc_norm_stderr\": 0.01636135476982247\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700032,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700032\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5378477391082209,\n \"mc2_stderr\": 0.015247687104643274\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059282\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5663381349507203,\n \"acc_stderr\": 0.013650728047064695\n 
}\n}\n```", "repo_url": "https://huggingface.co/macadeliccc/polyglot-math-4x7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-25-55.830403.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-25-55.830403.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-25-55.830403.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-25-55.830403.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|winogrande|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T01_25_55.830403", "path": ["results_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T01-25-55.830403.parquet"]}]}]} | 2024-01-14T01:28:36+00:00 |
2e764878615b4794eccf251afc673c203faed865 | jpqueiroz335/cw | [
"license:openrail",
"region:us"
] | 2024-01-14T01:28:45+00:00 | {"license": "openrail"} | 2024-01-14T01:28:45+00:00 |
|
669cb61386c31208043498a84c1665a5595ef61a |
# Dataset Card for Evaluation run of macadeliccc/laser-polyglot-4x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/laser-polyglot-4x7b](https://huggingface.co/macadeliccc/laser-polyglot-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b",
"harness_winogrande_5",
split="train")
```
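The aggregated metrics can be pulled the same way through the additional "results" configuration described above. This is a minimal sketch; it assumes the "results" configuration and the "latest" split naming used by these evaluation repositories (a timestamped split can be used instead to pin a specific run):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b",
    "results",
    split="latest",
)
print(results[0])  # inspect the stored aggregate record
```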
## Latest results
These are the [latest results from run 2024-01-14T01:28:04.517036](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b/blob/main/results_2024-01-14T01-28-04.517036.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6383969687290681,
"acc_stderr": 0.032222378716622334,
"acc_norm": 0.6424348983154926,
"acc_norm_stderr": 0.03285947296719794,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5546852358397387,
"mc2_stderr": 0.015162772354647294
},
"harness|arc:challenge|25": {
"acc": 0.6092150170648464,
"acc_stderr": 0.01425856388051378,
"acc_norm": 0.6416382252559727,
"acc_norm_stderr": 0.014012883334859857
},
"harness|hellaswag|10": {
"acc": 0.6581358295160327,
"acc_stderr": 0.0047336492748145075,
"acc_norm": 0.8498307110137423,
"acc_norm_stderr": 0.0035650718701954478
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478926,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478926
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.01633288239343135,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.01633288239343135
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265016,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265016
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.0160837499868537,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.0160837499868537
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342506,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342506
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824876,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824876
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786855,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786855
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5546852358397387,
"mc2_stderr": 0.015162772354647294
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497811
},
"harness|gsm8k|5": {
"acc": 0.4844579226686884,
"acc_stderr": 0.013765829454512891
}
}
```
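For a quick side-by-side view, the nested dictionary above can be flattened into a small table. The snippet below is only a sketch: the file name comes from the link above, and the dictionary shown is assumed to sit under a top-level "results" key in that file.

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file linked above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b",
    repo_type="dataset",
    filename="results_2024-01-14T01-28-04.517036.json",
)
with open(path) as f:
    latest_results = json.load(f)["results"]  # assumed key; adjust if the dict is at the top level

# Flatten per-task metrics (dropping the stderr entries) into (task, metric, value) rows.
rows = [
    (task, metric, value)
    for task, metrics in latest_results.items()
    for metric, value in metrics.items()
    if not metric.endswith("_stderr")
]
for task, metric, value in sorted(rows):
    print(f"{task:55s} {metric:10s} {value:.4f}")
```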
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b | [
"region:us"
] | 2024-01-14T01:30:23+00:00 | {"pretty_name": "Evaluation run of macadeliccc/laser-polyglot-4x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/laser-polyglot-4x7b](https://huggingface.co/macadeliccc/laser-polyglot-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T01:28:04.517036](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b/blob/main/results_2024-01-14T01-28-04.517036.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6383969687290681,\n \"acc_stderr\": 0.032222378716622334,\n \"acc_norm\": 0.6424348983154926,\n \"acc_norm_stderr\": 0.03285947296719794,\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5546852358397387,\n \"mc2_stderr\": 0.015162772354647294\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6092150170648464,\n \"acc_stderr\": 0.01425856388051378,\n \"acc_norm\": 0.6416382252559727,\n \"acc_norm_stderr\": 0.014012883334859857\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6581358295160327,\n \"acc_stderr\": 0.0047336492748145075,\n \"acc_norm\": 0.8498307110137423,\n \"acc_norm_stderr\": 0.0035650718701954478\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n 
\"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268552,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.01633288239343135,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.01633288239343135\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 
0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n \"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265016,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265016\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n \"acc_stderr\": 0.0160837499868537,\n \"acc_norm\": 0.36312849162011174,\n \"acc_norm_stderr\": 0.0160837499868537\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n \"acc_stderr\": 0.012733671880342506,\n \"acc_norm\": 0.4621903520208605,\n \"acc_norm_stderr\": 0.012733671880342506\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824876,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824876\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786855,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786855\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5546852358397387,\n \"mc2_stderr\": 0.015162772354647294\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497811\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4844579226686884,\n \"acc_stderr\": 0.013765829454512891\n }\n}\n```", "repo_url": 
"https://huggingface.co/macadeliccc/laser-polyglot-4x7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-28-04.517036.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-28-04.517036.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-28-04.517036.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-28-04.517036.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-28-04.517036.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|winogrande|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["results_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T01-28-04.517036.parquet"]}]}]} | 2024-01-14T01:30:47+00:00 |
bbd807c2c501c8e81ffa04501900e1c52fd8484b | SToons/Sekoutoons | [
"region:us"
] | 2024-01-14T01:35:25+00:00 | {} | 2024-01-14T01:35:25+00:00 |
|
35eb06dbe35a46e48b66863c33a79ad48e59b9a4 | Minata/cot_mistral_method2test_v0 | [
"region:us"
] | 2024-01-14T01:48:41+00:00 | {"dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 56984, "num_examples": 34}], "download_size": 19942, "dataset_size": 56984}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T02:34:56+00:00 |
|
0cb5eb4889e41e9ababd0af70133854c08640080 | AshtonLKY/augmented_audio | [
"region:us"
] | 2024-01-14T01:52:43+00:00 | {"dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "transcript", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9848101426.557, "num_examples": 67189}], "download_size": 10223714933, "dataset_size": 9848101426.557}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T02:09:03+00:00 |
|
c41e48261b3ebdad89576cba02e010cd839525fe | RuyYoshida/narradorestv | [
"license:openrail",
"region:us"
] | 2024-01-14T01:54:43+00:00 | {"license": "openrail"} | 2024-01-14T01:55:32+00:00 |
|
759795862113fce0d53c5a20825bd5e51d8ce2c3 |
# Dataset Card for Evaluation run of Weyaxi/Bagel-Hermes-34B-Slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Bagel-Hermes-34B-Slerp](https://huggingface.co/Weyaxi/Bagel-Hermes-34B-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Per-sample details for a single task (here: Winogrande, 5-shot)
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp",
	"harness_winogrande_5",
	split="train")
```
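If you want the aggregated scores of the run rather than the per-sample details of a single task, you can load the "results" configuration instead. The snippet below is a minimal sketch assuming this repository follows the same split layout as the per-task configurations (a timestamped split plus a "latest" split); adjust the split name if your copy differs.

```python
from datasets import load_dataset

# Aggregated metrics for the whole run; the "latest" split is assumed to point
# to the most recent results file listed in the repository metadata.
results = load_dataset("open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp",
	"results",
	split="latest")

# Each row holds the serialized results for one evaluation timestamp
print(results[0])
```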
## Latest results
These are the [latest results from run 2024-01-14T01:56:18.562449](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp/blob/main/results_2024-01-14T01-56-18.562449.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7687638749469244,
"acc_stderr": 0.02791668972955577,
"acc_norm": 0.7731851983230489,
"acc_norm_stderr": 0.028441222412067358,
"mc1": 0.4969400244798042,
"mc1_stderr": 0.01750317326096062,
"mc2": 0.6709148255495884,
"mc2_stderr": 0.014645409374455808
},
"harness|arc:challenge|25": {
"acc": 0.6706484641638225,
"acc_stderr": 0.013734057652635474,
"acc_norm": 0.7073378839590444,
"acc_norm_stderr": 0.013295916103619422
},
"harness|hellaswag|10": {
"acc": 0.6638119896434973,
"acc_stderr": 0.004714386376337134,
"acc_norm": 0.8568014339772954,
"acc_norm_stderr": 0.0034955936625207526
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.02629399585547494,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.02629399585547494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8188679245283019,
"acc_stderr": 0.023702963526757798,
"acc_norm": 0.8188679245283019,
"acc_norm_stderr": 0.023702963526757798
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.023112508176051236,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.023112508176051236
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349414,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349414
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7724137931034483,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.7724137931034483,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6931216931216931,
"acc_stderr": 0.02375292871211213,
"acc_norm": 0.6931216931216931,
"acc_norm_stderr": 0.02375292871211213
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9096774193548387,
"acc_stderr": 0.016306570644488313,
"acc_norm": 0.9096774193548387,
"acc_norm_stderr": 0.016306570644488313
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.018263105420199505,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.018263105420199505
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.01889552448260495,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.01889552448260495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.030182099804387262,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.030182099804387262
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.023274255898707946,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.023274255898707946
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5364238410596026,
"acc_stderr": 0.04071636065944217,
"acc_norm": 0.5364238410596026,
"acc_norm_stderr": 0.04071636065944217
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9192660550458716,
"acc_stderr": 0.011680172292862088,
"acc_norm": 0.9192660550458716,
"acc_norm_stderr": 0.011680172292862088
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280226,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280226
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.02632138319878367,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.02632138319878367
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331366,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331366
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253876,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253876
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9067688378033205,
"acc_stderr": 0.010397417087292849,
"acc_norm": 0.9067688378033205,
"acc_norm_stderr": 0.010397417087292849
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8179190751445087,
"acc_stderr": 0.02077676110251298,
"acc_norm": 0.8179190751445087,
"acc_norm_stderr": 0.02077676110251298
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.794413407821229,
"acc_stderr": 0.013516116210724202,
"acc_norm": 0.794413407821229,
"acc_norm_stderr": 0.013516116210724202
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.021170623011213505,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.021170623011213505
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8295819935691319,
"acc_stderr": 0.021355343028264053,
"acc_norm": 0.8295819935691319,
"acc_norm_stderr": 0.021355343028264053
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8827160493827161,
"acc_stderr": 0.017903112615281123,
"acc_norm": 0.8827160493827161,
"acc_norm_stderr": 0.017903112615281123
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.648936170212766,
"acc_stderr": 0.028473501272963758,
"acc_norm": 0.648936170212766,
"acc_norm_stderr": 0.028473501272963758
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6029986962190352,
"acc_stderr": 0.012496346982909556,
"acc_norm": 0.6029986962190352,
"acc_norm_stderr": 0.012496346982909556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8419117647058824,
"acc_stderr": 0.022161462608068522,
"acc_norm": 0.8419117647058824,
"acc_norm_stderr": 0.022161462608068522
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273344,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8448979591836735,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.8448979591836735,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4969400244798042,
"mc1_stderr": 0.01750317326096062,
"mc2": 0.6709148255495884,
"mc2_stderr": 0.014645409374455808
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873492
},
"harness|gsm8k|5": {
"acc": 0.6626231993934799,
"acc_stderr": 0.013023665136222096
}
}
```
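If you prefer not to parse the JSON above by hand, the aggregated metrics can also be loaded from the "results" configuration. The snippet below is a minimal sketch: it assumes the `datasets` library is installed and that the "latest" split points at this run; the exact schema of the returned row is not documented here.
```python
from datasets import load_dataset

# Aggregated metrics for this evaluation run: the "results" configuration
# holds one row per run, and the "latest" split points to the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp",
    "results",
    split="latest",
)
print(results[0])  # aggregated accuracies, stderrs, etc. for the latest run
```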
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp | [
"region:us"
] | 2024-01-14T01:58:30+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Bagel-Hermes-34B-Slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Bagel-Hermes-34B-Slerp](https://huggingface.co/Weyaxi/Bagel-Hermes-34B-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T01:56:18.562449](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp/blob/main/results_2024-01-14T01-56-18.562449.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7687638749469244,\n \"acc_stderr\": 0.02791668972955577,\n \"acc_norm\": 0.7731851983230489,\n \"acc_norm_stderr\": 0.028441222412067358,\n \"mc1\": 0.4969400244798042,\n \"mc1_stderr\": 0.01750317326096062,\n \"mc2\": 0.6709148255495884,\n \"mc2_stderr\": 0.014645409374455808\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6706484641638225,\n \"acc_stderr\": 0.013734057652635474,\n \"acc_norm\": 0.7073378839590444,\n \"acc_norm_stderr\": 0.013295916103619422\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6638119896434973,\n \"acc_stderr\": 0.004714386376337134,\n \"acc_norm\": 0.8568014339772954,\n \"acc_norm_stderr\": 0.0034955936625207526\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.02629399585547494,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.02629399585547494\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8188679245283019,\n \"acc_stderr\": 0.023702963526757798,\n \"acc_norm\": 0.8188679245283019,\n \"acc_norm_stderr\": 0.023702963526757798\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.023112508176051236,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.023112508176051236\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.049598599663841815,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.049598599663841815\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349414,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349414\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7724137931034483,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.7724137931034483,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6931216931216931,\n \"acc_stderr\": 0.02375292871211213,\n \"acc_norm\": 0.6931216931216931,\n \"acc_norm_stderr\": 0.02375292871211213\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9096774193548387,\n \"acc_stderr\": 0.016306570644488313,\n \"acc_norm\": 0.9096774193548387,\n \"acc_norm_stderr\": 0.016306570644488313\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969566,\n \"acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969566\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199505,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199505\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.01889552448260495,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.01889552448260495\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.030182099804387262,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.030182099804387262\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707946,\n \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707946\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5364238410596026,\n \"acc_stderr\": 0.04071636065944217,\n \"acc_norm\": 0.5364238410596026,\n \"acc_norm_stderr\": 0.04071636065944217\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9192660550458716,\n \"acc_stderr\": 0.011680172292862088,\n \"acc_norm\": 0.9192660550458716,\n \"acc_norm_stderr\": 0.011680172292862088\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.02693611191280226,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.02693611191280226\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.02632138319878367,\n \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.02632138319878367\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.6339285714285714,\n \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331366,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331366\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253876,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253876\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9067688378033205,\n \"acc_stderr\": 0.010397417087292849,\n \"acc_norm\": 0.9067688378033205,\n \"acc_norm_stderr\": 0.010397417087292849\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8179190751445087,\n \"acc_stderr\": 0.02077676110251298,\n \"acc_norm\": 0.8179190751445087,\n \"acc_norm_stderr\": 0.02077676110251298\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.794413407821229,\n \"acc_stderr\": 0.013516116210724202,\n \"acc_norm\": 0.794413407821229,\n \"acc_norm_stderr\": 0.013516116210724202\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.021170623011213505,\n \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.021170623011213505\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8295819935691319,\n \"acc_stderr\": 0.021355343028264053,\n \"acc_norm\": 0.8295819935691319,\n \"acc_norm_stderr\": 0.021355343028264053\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8827160493827161,\n \"acc_stderr\": 0.017903112615281123,\n \"acc_norm\": 0.8827160493827161,\n \"acc_norm_stderr\": 0.017903112615281123\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.648936170212766,\n \"acc_stderr\": 0.028473501272963758,\n \"acc_norm\": 0.648936170212766,\n \"acc_norm_stderr\": 0.028473501272963758\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6029986962190352,\n \"acc_stderr\": 0.012496346982909556,\n \"acc_norm\": 0.6029986962190352,\n \"acc_norm_stderr\": 0.012496346982909556\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8419117647058824,\n \"acc_stderr\": 0.022161462608068522,\n \"acc_norm\": 0.8419117647058824,\n \"acc_norm_stderr\": 0.022161462608068522\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273344,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273344\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8448979591836735,\n \"acc_stderr\": 0.0231747988612186,\n \"acc_norm\": 0.8448979591836735,\n \"acc_norm_stderr\": 0.0231747988612186\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4969400244798042,\n \"mc1_stderr\": 0.01750317326096062,\n \"mc2\": 0.6709148255495884,\n \"mc2_stderr\": 0.014645409374455808\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873492\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6626231993934799,\n \"acc_stderr\": 0.013023665136222096\n 
}\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/Bagel-Hermes-34B-Slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-18.562449.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-18.562449.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-18.562449.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-18.562449.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|winogrande|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T01_56_18.562449", "path": ["results_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T01-56-18.562449.parquet"]}]}]} | 2024-01-14T01:58:52+00:00 |
0ed116fa8a6676b7d581035fe3928d625442ae6e |
# Dataset Card for Evaluation run of macadeliccc/laser-dolphin-mixtral-4x7b-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/laser-dolphin-mixtral-4x7b-dpo](https://huggingface.co/macadeliccc/laser-dolphin-mixtral-4x7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo",
"harness_winogrande_5",
split="train")
```
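For a quick look at the aggregated scores rather than the per-task details, the "results" configuration can be loaded the same way. This is a minimal sketch; the exact column layout of the results parquet is not documented here, so the snippet only inspects it:
```python
from datasets import load_dataset

# Aggregated results for this run; the "latest" split always points to the
# most recent evaluation of this model.
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo",
    "results",
    split="latest",
)

# Inspect the schema before relying on any particular field.
print(results.column_names)
print(results[0])
```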
## Latest results
These are the [latest results from run 2024-01-14T01:56:15.562894](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo/blob/main/results_2024-01-14T01-56-15.562894.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6304823287754658,
"acc_stderr": 0.03239962883986832,
"acc_norm": 0.6345924216801483,
"acc_norm_stderr": 0.033044077680253386,
"mc1": 0.4589963280293758,
"mc1_stderr": 0.017444544447661192,
"mc2": 0.6377296280073737,
"mc2_stderr": 0.015266761289957081
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111728,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726096
},
"harness|hellaswag|10": {
"acc": 0.6742680740888269,
"acc_stderr": 0.004676898861978916,
"acc_norm": 0.8580959968133838,
"acc_norm_stderr": 0.003482384956632782
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094767,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094767
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431353,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431353
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579823,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579823
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187306,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187306
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.01939305840235544,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.01939305840235544
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368032,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368032
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4589963280293758,
"mc1_stderr": 0.017444544447661192,
"mc2": 0.6377296280073737,
"mc2_stderr": 0.015266761289957081
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497813
},
"harness|gsm8k|5": {
"acc": 0.4488248673237301,
"acc_stderr": 0.01370015744278808
}
}
```
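If you prefer to work with the raw JSON shown above instead of the parquet configs, the results file can be fetched directly from the repository. This is a minimal sketch using the standard `huggingface_hub` download helper; the filename comes from the link above, and since the per-task scores may sit under a top-level "results" key depending on the harness version, the snippet handles both layouts:
```python
import json

from huggingface_hub import hf_hub_download

# Download the results JSON referenced above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo",
    repo_type="dataset",
    filename="results_2024-01-14T01-56-15.562894.json",
)

with open(path) as f:
    payload = json.load(f)

# The scores shown above may be nested under a "results" key in the file.
scores = payload.get("results", payload)
print(scores["all"]["acc"], scores["all"]["acc_norm"])
```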
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
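Pending a fuller description, the configuration and split layout summarised at the top of this card can be inspected programmatically. A minimal sketch with the standard `datasets` helpers (no assumptions are made about the column layout of each config):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo"

# The 63 per-task configurations plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs))

# Each configuration exposes one split per run timestamp and a "latest" alias.
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```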
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo | [
"region:us"
] | 2024-01-14T01:58:34+00:00 | {"pretty_name": "Evaluation run of macadeliccc/laser-dolphin-mixtral-4x7b-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/laser-dolphin-mixtral-4x7b-dpo](https://huggingface.co/macadeliccc/laser-dolphin-mixtral-4x7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T01:56:15.562894](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo/blob/main/results_2024-01-14T01-56-15.562894.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6304823287754658,\n \"acc_stderr\": 0.03239962883986832,\n \"acc_norm\": 0.6345924216801483,\n \"acc_norm_stderr\": 0.033044077680253386,\n \"mc1\": 0.4589963280293758,\n \"mc1_stderr\": 0.017444544447661192,\n \"mc2\": 0.6377296280073737,\n \"mc2_stderr\": 0.015266761289957081\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111728,\n \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726096\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6742680740888269,\n \"acc_stderr\": 0.004676898861978916,\n \"acc_norm\": 0.8580959968133838,\n \"acc_norm_stderr\": 0.003482384956632782\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n 
\"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094767,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094767\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431353,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431353\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8199233716475096,\n \"acc_stderr\": 0.013740797258579823,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579823\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n \"acc_stderr\": 0.015748421208187306,\n \"acc_norm\": 0.3318435754189944,\n \"acc_norm_stderr\": 0.015748421208187306\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.029227192460032025,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.029227192460032025\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.01939305840235544,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.01939305840235544\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368032,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368032\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4589963280293758,\n \"mc1_stderr\": 0.017444544447661192,\n \"mc2\": 0.6377296280073737,\n \"mc2_stderr\": 0.015266761289957081\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497813\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4488248673237301,\n \"acc_stderr\": 0.01370015744278808\n 
}\n}\n```", "repo_url": "https://huggingface.co/macadeliccc/laser-dolphin-mixtral-4x7b-dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-15.562894.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-15.562894.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-15.562894.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-15.562894.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|winogrande|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T01_56_15.562894", "path": ["results_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T01-56-15.562894.parquet"]}]}]} | 2024-01-14T01:58:57+00:00 |
7b0f6210085a22fc0fdc62b7eca34ad8c219060e | jpqueiroz335/charleswebb | [
"license:openrail",
"region:us"
] | 2024-01-14T02:02:17+00:00 | {"license": "openrail"} | 2024-01-14T02:04:20+00:00 |
|
445b0db15ad7252f26c4d50738e9f6d24f127013 | modelloosrvcc/Wukong_Mico | [
"license:openrail",
"region:us"
] | 2024-01-14T02:05:39+00:00 | {"license": "openrail"} | 2024-01-14T03:09:57+00:00 |
|
95e0259118327b4bda5d80b334f3b1cd8e52e3a2 |
# Dataset of clemenceau/クレマンソー/克莱蒙梭 (Azur Lane)
This is the dataset of clemenceau/クレマンソー/克莱蒙梭 (Azur Lane), containing 43 images and their tags.
The core tags of this character are `long_hair, breasts, large_breasts, pink_hair, red_eyes, crown, bangs, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 43 | 86.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clemenceau_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 43 | 40.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clemenceau_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 108 | 87.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clemenceau_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 43 | 70.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clemenceau_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 108 | 138.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clemenceau_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
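If waifuc is not needed, the `IMG+TXT` packages above can be used directly: each archive presumably contains images together with same-named `.txt` caption files. Below is a minimal sketch for the `dataset-800.zip` package; the local directory name and the exact archive layout are illustrative assumptions, not part of the packaging spec.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from PIL import Image

# download the 800px IMG+TXT package (any of the packages above works the same way)
zip_file = hf_hub_download(
    repo_id='CyberHarem/clemenceau_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract files to a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# walk the extracted folder and pair each image with its caption file
for root, _, files in os.walk(dataset_dir):
    for name in sorted(files):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        txt_path = os.path.join(root, stem + '.txt')
        caption = ''
        if os.path.exists(txt_path):
            with open(txt_path, encoding='utf-8') as f:
                caption = f.read().strip()
        with Image.open(os.path.join(root, name)) as image:
            print(name, image.size, caption[:80])
```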
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/clemenceau_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, white_gloves, black_skirt, looking_at_viewer, ponytail, visor_cap, cleavage, bare_shoulders, outdoors, thighs, brown_hair, holding, miniskirt, pencil_skirt, clothing_cutout, crop_top, earrings, sky, belt, cloud, golf_club, sleeveless_shirt |
| 1 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_gloves, black_dress, elbow_gloves, cleavage, fur_trim, jewelry, holding, smile, cape, simple_background, cross |
| 2 | 5 |  |  |  |  |  | 1girl, black_gloves, looking_at_viewer, solo, black_dress, cape, fur_trim, hair_between_eyes, long_dress, pink_eyes, standing, braid, cleavage, closed_mouth, holding_staff, signature, simple_background, smile, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_gloves | black_skirt | looking_at_viewer | ponytail | visor_cap | cleavage | bare_shoulders | outdoors | thighs | brown_hair | holding | miniskirt | pencil_skirt | clothing_cutout | crop_top | earrings | sky | belt | cloud | golf_club | sleeveless_shirt | black_gloves | black_dress | elbow_gloves | fur_trim | jewelry | smile | cape | simple_background | cross | hair_between_eyes | long_dress | pink_eyes | standing | braid | closed_mouth | holding_staff | signature | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------|:--------------------|:-----------|:------------|:-----------|:-----------------|:-----------|:---------|:-------------|:----------|:------------|:---------------|:------------------|:-----------|:-----------|:------|:-------|:--------|:------------|:-------------------|:---------------|:--------------|:---------------|:-----------|:----------|:--------|:-------|:--------------------|:--------|:--------------------|:-------------|:------------|:-----------|:--------|:---------------|:----------------|:------------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | | X | | | X | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | | X | | | X | | | | | | | | | | | | | | | | X | X | | X | | X | X | X | | X | X | X | X | X | X | X | X | X |
| CyberHarem/clemenceau_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:09:09+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:20:44+00:00 |
686d368c78224b85fe3d8c3e197d1ca4e2a38a36 |
# Dataset of yorck/ヨルク/约克DE (Azur Lane)
This is the dataset of yorck/ヨルク/约克DE (Azur Lane), containing 51 images and their tags.
The core tags of this character are `breasts, long_hair, large_breasts, white_hair, bangs, red_eyes, hat, black_headwear, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 51 | 97.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yorck_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 51 | 45.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yorck_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 135 | 104.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yorck_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 51 | 80.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yorck_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 135 | 160.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yorck_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yorck_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, official_alternate_costume, black_dress, black_thighhighs, choker, thighs, horns, very_long_hair, bare_shoulders, sitting, smile, brown_thighhighs, blush, thigh_strap, closed_mouth, evening_gown |
| 1 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, black_gloves, bare_shoulders, black_dress, smile, blush, fishnets, iron_cross, earrings, military_hat, white_thighhighs, closed_mouth, simple_background, white_background, peaked_cap |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | cleavage | official_alternate_costume | black_dress | black_thighhighs | choker | thighs | horns | very_long_hair | bare_shoulders | sitting | smile | brown_thighhighs | blush | thigh_strap | closed_mouth | evening_gown | black_gloves | fishnets | iron_cross | earrings | military_hat | white_thighhighs | simple_background | white_background | peaked_cap |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------|:-----------------------------|:--------------|:-------------------|:---------|:---------|:--------|:-----------------|:-----------------|:----------|:--------|:-------------------|:--------|:--------------|:---------------|:---------------|:---------------|:-----------|:-------------|:-----------|:---------------|:-------------------|:--------------------|:-------------------|:-------------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | X | X | X | | X | | | | | | X | | X | | X | | X | | X | X | X | X | X | X | X | X | X |
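Beyond the precomputed clusters above, the caption files of any extracted `IMG+TXT` package can be analysed directly, for example to count how often each tag occurs. A small sketch, assuming captions are comma-separated tag lists (matching the tag style in the tables above) and that a package has already been extracted to `dataset_dir`:
```python
import os
from collections import Counter

dataset_dir = 'dataset_dir'  # folder where an IMG+TXT package was extracted
counter = Counter()
for root, _, files in os.walk(dataset_dir):
    for name in files:
        if not name.endswith('.txt'):
            continue
        with open(os.path.join(root, name), encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        counter.update(tags)

# show the 20 most frequent tags, similar to the cluster summaries above
for tag, count in counter.most_common(20):
    print(f'{count:4d}  {tag}')
```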
| CyberHarem/yorck_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:09:27+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:29:18+00:00 |
0433d919af9554d231611ee3ea794a86fa4085d4 |
# Dataset of katsuragi/葛城/葛城 (Azur Lane)
This is the dataset of katsuragi/葛城/葛城 (Azur Lane), containing 41 images and their tags.
The core tags of this character are `breasts, long_hair, hair_ornament, twintails, small_breasts, blue_eyes, black_hair, earrings, bangs, green_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 41 | 58.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katsuragi_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 41 | 33.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katsuragi_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 90 | 65.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katsuragi_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 41 | 51.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katsuragi_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 90 | 91.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katsuragi_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/katsuragi_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
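Building on the loop above, items can also be filtered by their tags before further processing. A minimal sketch; the target tag and output directory are arbitrary, and `item.image` is assumed to behave like a PIL image, as exposed by waifuc's `ImageItem`:
```python
import os
from waifuc.source import LocalSource

# reuse the extracted raw dataset from the snippet above
source = LocalSource('dataset_dir')

output_dir = 'filtered'            # arbitrary output folder
os.makedirs(output_dir, exist_ok=True)

target_tag = 'playboy_bunny'       # any tag from the cluster tables below
for item in source:
    if target_tag in item.meta['tags']:
        # item.image is assumed to be a PIL image here
        item.image.save(os.path.join(output_dir, item.meta['filename']))
```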
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 29 |  |  |  |  |  | 1girl, solo, looking_at_viewer, jewelry, open_mouth, detached_sleeves, simple_background, white_thighhighs, blush, hairband, smile, white_background, blue_hair |
| 1 | 7 |  |  |  |  |  | 1girl, fake_animal_ears, playboy_bunny, rabbit_ears, solo, looking_at_viewer, official_alternate_costume, rabbit_tail, red_leotard, black_gloves, black_pantyhose, hair_flower, covered_navel, open_mouth, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | jewelry | open_mouth | detached_sleeves | simple_background | white_thighhighs | blush | hairband | smile | white_background | blue_hair | fake_animal_ears | playboy_bunny | rabbit_ears | official_alternate_costume | rabbit_tail | red_leotard | black_gloves | black_pantyhose | hair_flower | covered_navel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:----------|:-------------|:-------------------|:--------------------|:-------------------|:--------|:-----------|:--------|:-------------------|:------------|:-------------------|:----------------|:--------------|:-----------------------------|:--------------|:--------------|:---------------|:------------------|:--------------|:----------------|
| 0 | 29 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | X | | X | | | | | X | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/katsuragi_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:09:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:24:39+00:00 |
b9363405f2840f3b0bf65d10552b122d07791c8e |
# Dataset of yumi/雪泉/雪泉 (Azur Lane)
This is the dataset of yumi/雪泉/雪泉 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `breasts, blue_eyes, short_hair, bow, grey_hair, hair_bow, large_breasts, white_bow, medium_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
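Since the core tags listed above are pruned from the per-image captions, trainers that rely on them (or on a trigger word) may want to add them back as a fixed caption prefix. A small sketch of one way to do this over an extracted `IMG+TXT` package; the trigger word and directory name are illustrative assumptions, and the captions are rewritten in place.
```python
import os

# core tags pruned from this dataset, as listed above
core_tags = 'breasts, blue_eyes, short_hair, bow, grey_hair, hair_bow, large_breasts, white_bow, medium_hair'
trigger = 'yumi_azurlane'    # hypothetical trigger word

dataset_dir = 'dataset_dir'  # folder where an IMG+TXT package was extracted
for root, _, files in os.walk(dataset_dir):
    for name in files:
        if not name.endswith('.txt'):
            continue
        path = os.path.join(root, name)
        with open(path, encoding='utf-8') as f:
            caption = f.read().strip()
        # prepend the trigger word and the pruned core tags to the existing caption
        with open(path, 'w', encoding='utf-8') as f:
            f.write(', '.join([trigger, core_tags, caption]))
```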
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 680.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yumi_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 374.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yumi_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1222 | 781.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yumi_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 593.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yumi_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1222 | 1.12 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yumi_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yumi_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, white_background, collarbone, simple_background, blush, bare_shoulders, navel, smile, bangs, huge_breasts, blue_bikini |
| 1 | 9 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, collarbone, looking_at_viewer, off_shoulder, simple_background, solo, white_background, white_kimono, bangs, low_neckline, blush, open_mouth, shiny_skin, huge_breasts, shiny_hair |
| 2 | 7 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, collarbone, looking_at_viewer, off_shoulder, parted_bangs, solo, white_kimono, low_neckline, upper_body, blush, closed_mouth, smile, wide_sleeves, snowflakes |
| 3 | 11 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, kimono, looking_at_viewer, off_shoulder, solo, low_neckline, collarbone, folding_fan, huge_breasts, smile |
| 4 | 23 |  |  |  |  |  | day, looking_at_viewer, cleavage, 1girl, outdoors, navel, smile, solo, blush, beach, ocean, blue_sky, cloud, blue_bikini, open_mouth, water, bare_shoulders, collarbone, side-tie_bikini_bottom |
| 5 | 12 |  |  |  |  |  | 1boy, 1girl, blush, collarbone, hetero, solo_focus, nipples, paizuri, huge_breasts, penis, breasts_squeezed_together, open_mouth, bare_shoulders, looking_at_viewer, nude, smile, sweat, bangs, mosaic_censoring |
| 6 | 7 |  |  |  |  |  | 1girl, cat_ears, looking_at_viewer, solo, navel, open_mouth, smile, blush, cat_tail, cleavage, simple_background, bare_shoulders, bell, white_background, cat_paws, gloves, white_panties |
| 7 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, open_mouth, simple_background, white_background, white_shirt, black_pantyhose, smile, black_hair, black_skirt, cleavage, long_sleeves, pencil_skirt |
| 8 | 8 |  |  |  |  |  | 1girl, blush, hetero, huge_breasts, penis, pussy, sex, shiny_hair, spread_legs, vaginal, 1boy, shiny_skin, solo_focus, bar_censor, navel, nipples, nude, open_mouth, sweat, collarbone, kimono |
| 9 | 12 |  |  |  |  |  | playboy_bunny, rabbit_ears, 1girl, fake_animal_ears, solo, detached_collar, looking_at_viewer, rabbit_tail, strapless_leotard, bare_shoulders, pantyhose, blush, cleavage, fishnets, white_background, white_leotard, simple_background, wrist_cuffs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | solo | white_background | collarbone | simple_background | blush | bare_shoulders | navel | smile | bangs | huge_breasts | blue_bikini | off_shoulder | white_kimono | low_neckline | open_mouth | shiny_skin | shiny_hair | parted_bangs | upper_body | closed_mouth | wide_sleeves | snowflakes | kimono | folding_fan | day | outdoors | beach | ocean | blue_sky | cloud | water | side-tie_bikini_bottom | 1boy | hetero | solo_focus | nipples | paizuri | penis | breasts_squeezed_together | nude | sweat | mosaic_censoring | cat_ears | cat_tail | bell | cat_paws | gloves | white_panties | white_shirt | black_pantyhose | black_hair | black_skirt | long_sleeves | pencil_skirt | pussy | sex | spread_legs | vaginal | bar_censor | playboy_bunny | rabbit_ears | fake_animal_ears | detached_collar | rabbit_tail | strapless_leotard | pantyhose | fishnets | white_leotard | wrist_cuffs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:-------|:-------------------|:-------------|:--------------------|:--------|:-----------------|:--------|:--------|:--------|:---------------|:--------------|:---------------|:---------------|:---------------|:-------------|:-------------|:-------------|:---------------|:-------------|:---------------|:---------------|:-------------|:---------|:--------------|:------|:-----------|:--------|:--------|:-----------|:--------|:--------|:-------------------------|:-------|:---------|:-------------|:----------|:----------|:--------|:----------------------------|:-------|:--------|:-------------------|:-----------|:-----------|:-------|:-----------|:---------|:----------------|:--------------|:------------------|:-------------|:--------------|:---------------|:---------------|:--------|:------|:--------------|:----------|:-------------|:----------------|:--------------|:-------------------|:------------------|:--------------|:--------------------|:------------|:-----------|:----------------|:--------------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | | X | | X | X | | X | | | | X | X | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | X | X | | X | | | X | | X | | X | | X | | X | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 23 |  |  |  |  |  | X | X | X | X | | X | | X | X | X | X | | | X | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | | X | | | X | | X | X | | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 8 | 8 |  |  |  |  |  | X | | | | | X | | X | | X | | | X | | | | | X | X | X | | | | | | X | | | | | | | | | | X | X | X | X | | X | | X | X | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | |
| 9 | 12 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/yumi_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:09:51+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T04:24:16+00:00 |
06da3818ad180c58eaa3dc0ef6966c8a8a88e3e9 |
# Dataset of k31/K31/K31 (Girls' Frontline)
This is the dataset of k31/K31/K31 (Girls' Frontline), containing 18 images and their tags.
The core tags of this character are `hair_ornament, pink_hair, long_hair, purple_eyes, headphones, breasts, bangs, hair_between_eyes, hair_intakes, x_hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 18 | 21.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 18 | 10.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 45 | 23.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 18 | 18.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 45 | 37.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
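The `stage3-p480-*` packages above contain cropped variants whose area is documented as not less than 480x480 pixels. A quick sanity check over an extracted stage3 archive might look like this (the directory name is illustrative):
```python
import os
from PIL import Image

dataset_dir = 'dataset_stage3'   # folder where dataset-stage3-p480-800.zip was extracted
min_area = 480 * 480             # "area not less than 480x480 pixels"

for root, _, files in os.walk(dataset_dir):
    for name in files:
        if os.path.splitext(name)[1].lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        with Image.open(os.path.join(root, name)) as im:
            width, height = im.size
        if width * height < min_area:
            print(f'unexpectedly small crop: {name} ({width}x{height})')
```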
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/k31_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, solo, cleavage, holding, smile, looking_at_viewer, simple_background, white_background, blush, black_jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | holding | smile | looking_at_viewer | simple_background | white_background | blush | black_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:----------|:--------|:--------------------|:--------------------|:-------------------|:--------|:---------------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/k31_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:01+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:31:37+00:00 |
9df828bf4e0a920bafc18f0deced4f7a373140b6 |
# Dataset of pp_19/PP-19/PP-19 (Girls' Frontline)
This is the dataset of pp_19/PP-19/PP-19 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `blue_eyes, short_hair, white_hair, bangs, breasts, medium_breasts, blunt_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 13.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_19_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 8.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_19_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 33 | 19.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_19_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 12.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_19_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 33 | 23.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_19_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pp_19_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, looking_at_viewer, gloves, fur_trim, gun, boots, holding_weapon, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | gloves | fur_trim | gun | boots | holding_weapon | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:---------|:-----------|:------|:--------|:-----------------|:-------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
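For training pipelines that expect a single metadata file rather than per-image `.txt` captions, the extracted image/caption pairs can be collected into a JSONL manifest. A minimal sketch; the directory and output file names are arbitrary:
```python
import json
import os

dataset_dir = 'dataset_dir'       # folder where an IMG+TXT package was extracted
manifest_path = 'metadata.jsonl'  # arbitrary output file

with open(manifest_path, 'w', encoding='utf-8') as out:
    for root, _, files in os.walk(dataset_dir):
        for name in sorted(files):
            stem, ext = os.path.splitext(name)
            if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
                continue
            txt_path = os.path.join(root, stem + '.txt')
            caption = ''
            if os.path.exists(txt_path):
                with open(txt_path, encoding='utf-8') as f:
                    caption = f.read().strip()
            record = {
                'file_name': os.path.relpath(os.path.join(root, name), dataset_dir),
                'text': caption,
            }
            out.write(json.dumps(record, ensure_ascii=False) + '\n')
```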
| CyberHarem/pp_19_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:04+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:31:14+00:00 |
991bf5e0f55f3955235c1f4e313d5b42d35ea967 |
# Dataset of pp_90/PP-90/PP-90 (Girls' Frontline)
This is the dataset of pp_90/PP-90/PP-90 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are `twintails, drill_hair, grey_hair, red_eyes, twin_drills, bangs, hair_ornament, long_hair, ahoge, headphones, x_hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 20.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_90_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 13.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_90_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 41 | 25.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_90_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 19.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_90_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 41 | 34.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_90_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pp_90_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
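To eyeball the crawl quality quickly, the first few items loaded above can be pasted into a simple contact sheet. A sketch under the assumption that `item.image` behaves like a PIL image (as in waifuc's `ImageItem`); the thumbnail size and output path are arbitrary:
```python
from PIL import Image
from waifuc.source import LocalSource

# build a small contact sheet from the first few items of the raw dataset
source = LocalSource('dataset_dir')
thumbs = []
for i, item in enumerate(source):
    if i >= 8:
        break
    thumb = item.image.copy()   # item.image is assumed to be a PIL image
    thumb.thumbnail((256, 256))
    thumbs.append(thumb)

# paste thumbnails side by side into one preview sheet
sheet = Image.new('RGB', (256 * max(len(thumbs), 1), 256), 'white')
for i, thumb in enumerate(thumbs):
    sheet.paste(thumb, (256 * i, 0))
sheet.save('preview.png')
```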
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, solo, open_mouth, smile, white_shirt, black_gloves, looking_at_viewer, black_jacket, green_necktie, holding, long_sleeves, open_jacket, shorts, simple_background, blush, navel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | open_mouth | smile | white_shirt | black_gloves | looking_at_viewer | black_jacket | green_necktie | holding | long_sleeves | open_jacket | shorts | simple_background | blush | navel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------|:--------|:--------------|:---------------|:--------------------|:---------------|:----------------|:----------|:---------------|:--------------|:---------|:--------------------|:--------|:--------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/pp_90_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:05+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:30:50+00:00 |
322de340a1c21971b9a26609da8b537475e9ff1f |
# Dataset of qbz_191/QBZ-191/QBZ-191 (Girls' Frontline)
This is the dataset of qbz_191/QBZ-191/QBZ-191 (Girls' Frontline), containing 22 images and their tags.
The core tags of this character are `long_hair, bangs, black_hair, breasts, orange_eyes, medium_breasts, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 30.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qbz_191_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 17.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qbz_191_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 58 | 36.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qbz_191_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 27.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qbz_191_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 58 | 48.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qbz_191_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/qbz_191_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
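As an alternative to the waifuc loop above, an extracted `IMG+TXT` folder can also be loaded with the Hugging Face `datasets` library's generic `imagefolder` builder. This requires placing a `metadata.jsonl` (or `metadata.csv`) with a `file_name` column and a caption column next to the images; the paths below are illustrative:
```python
from datasets import load_dataset

# 'dataset_dir' must contain the images plus a metadata.jsonl with
# {"file_name": ..., "text": ...} records for the captions
ds = load_dataset('imagefolder', data_dir='dataset_dir', split='train')
example = ds[0]
print(example['image'].size, example.get('text', ''))
```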
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, solo, gloves, looking_at_viewer, white_dress, standing, black_thighhighs, holding_gun, assault_rifle, closed_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | gloves | looking_at_viewer | white_dress | standing | black_thighhighs | holding_gun | assault_rifle | closed_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------|:--------------------|:--------------|:-----------|:-------------------|:--------------|:----------------|:---------------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/qbz_191_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:07+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:33:12+00:00 |
6aec2e875db362b516ff497f6f61beed32dee37a |
# Dataset of p08/P08/P08 (Girls' Frontline)
This is the dataset of p08/P08/P08 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are `short_hair, breasts, hat, brown_eyes, garrison_cap, medium_breasts, white_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 20.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 12.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 43 | 24.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 18.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 43 | 34.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/p08_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, looking_at_viewer, cleavage, long_sleeves, white_gloves, blue_jacket, boots, cropped_jacket, smile, thigh_strap, white_background, belt, black_leotard, blush, handgun, military_uniform, open_clothes, simple_background, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | cleavage | long_sleeves | white_gloves | blue_jacket | boots | cropped_jacket | smile | thigh_strap | white_background | belt | black_leotard | blush | handgun | military_uniform | open_clothes | simple_background | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-----------|:---------------|:---------------|:--------------|:--------|:-----------------|:--------|:--------------|:-------------------|:-------|:----------------|:--------|:----------|:-------------------|:---------------|:--------------------|:-----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/p08_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:11+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:31:47+00:00 |
f1dcc898be2617bcf68d9f5690de03e9d5041db2 |
# Dataset of t_5000/T-5000/T-5000 (Girls' Frontline)
This is the dataset of t_5000/T-5000/T-5000 (Girls' Frontline), containing 17 images and their tags.
The core tags of this character are `long_hair, red_hair, blue_eyes, breasts, hair_between_eyes, very_long_hair, bangs, medium_breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 17.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_5000_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 11.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_5000_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 31 | 19.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_5000_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 16.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_5000_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 31 | 26.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_5000_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/t_5000_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, crop_top, looking_at_viewer, midriff, navel, fingerless_gloves, short_shorts, black_gloves, blush, thigh_strap, black_shirt, full_body, pouch, rifle, simple_background, socks, white_jacket, white_shorts, belt, bright_pupils, eyes_visible_through_hair, holding, single_thighhigh, standing, sweatdrop, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | crop_top | looking_at_viewer | midriff | navel | fingerless_gloves | short_shorts | black_gloves | blush | thigh_strap | black_shirt | full_body | pouch | rifle | simple_background | socks | white_jacket | white_shorts | belt | bright_pupils | eyes_visible_through_hair | holding | single_thighhigh | standing | sweatdrop | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:----------|:--------|:--------------------|:---------------|:---------------|:--------|:--------------|:--------------|:------------|:--------|:--------|:--------------------|:--------|:---------------|:---------------|:-------|:----------------|:----------------------------|:----------|:-------------------|:-----------|:------------|:-------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/t_5000_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:22+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:31:34+00:00 |
7adacea4af18483722f8aa06e313c25c82d12d36 |
# Dataset of ff_m249saw/FFM249SAW/M249SAW (Girls' Frontline)
This is the dataset of ff_m249saw/FFM249SAW/M249SAW (Girls' Frontline), containing 19 images and their tags.
The core tags of this character are `blue_hair, long_hair, yellow_eyes, breasts, very_long_hair, large_breasts, bangs, eyewear_on_head, medium_breasts, sunglasses`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 29.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ff_m249saw_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 16.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ff_m249saw_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 44 | 34.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ff_m249saw_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 26.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ff_m249saw_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 44 | 51.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ff_m249saw_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ff_m249saw_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, bikini, blush, cleavage, solo, bubble_blowing, chewing_gum, collarbone, jacket, navel, fur_trim |
| 1 | 5 |  |  |  |  |  | 1girl, crop_top, midriff, navel, bubble_blowing, fingerless_gloves, short_shorts, solo, chewing_gum, cowboy_shot, hood |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | bikini | blush | cleavage | solo | bubble_blowing | chewing_gum | collarbone | jacket | navel | fur_trim | crop_top | midriff | fingerless_gloves | short_shorts | cowboy_shot | hood |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:---------|:--------|:-----------|:-------|:-----------------|:--------------|:-------------|:---------|:--------|:-----------|:-----------|:----------|:--------------------|:---------------|:--------------|:-------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | | | X | X | X | | | X | | X | X | X | X | X | X |
| CyberHarem/ff_m249saw_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:27+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:31:48+00:00 |
0b9594e02a572e0476a655b425532acdf23a0e11 | dread1900/Marcus_Holloway | [
"region:us"
] | 2024-01-14T02:38:55+00:00 | {} | 2024-01-14T02:42:47+00:00 |
|
27d49056dd5934d013a9cd75820d3b78b78a2296 |
# Dataset Card for Evaluation run of macadeliccc/SOLAR-math-2x10.7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/SOLAR-math-2x10.7b](https://huggingface.co/macadeliccc/SOLAR-math-2x10.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b",
"harness_winogrande_5",
split="train")
```
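Each evaluated task lives in its own configuration. The following is a minimal sketch for enumerating the configurations and loading the aggregated "results" configuration described above; the split name is an assumption, so use the timestamped split or "train" if "latest" is not exposed.
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b"

# one configuration per evaluated task, plus the aggregated "results" one
configs = get_dataset_config_names(repo)
print(len(configs), "configurations available")

# load the aggregated results of the latest run
results = load_dataset(repo, "results", split="latest")
print(results)
```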
## Latest results
These are the [latest results from run 2024-01-14T02:37:03.730641](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b/blob/main/results_2024-01-14T02-37-03.730641.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in its own configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.672153123323673,
"acc_stderr": 0.03128879331345752,
"acc_norm": 0.6725032879829345,
"acc_norm_stderr": 0.031933166428242975,
"mc1": 0.4810281517747858,
"mc1_stderr": 0.01749089640576235,
"mc2": 0.642058591491927,
"mc2_stderr": 0.015391497190020965
},
"harness|arc:challenge|25": {
"acc": 0.6493174061433447,
"acc_stderr": 0.013944635930726099,
"acc_norm": 0.6843003412969283,
"acc_norm_stderr": 0.013582571095815291
},
"harness|hellaswag|10": {
"acc": 0.6778530173272257,
"acc_stderr": 0.004663439181149046,
"acc_norm": 0.8630750846444931,
"acc_norm_stderr": 0.0034306550069275825
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099583,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099583
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4708994708994709,
"acc_stderr": 0.025707658614154957,
"acc_norm": 0.4708994708994709,
"acc_norm_stderr": 0.025707658614154957
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.022828881775249377,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.022828881775249377
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033446,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033446
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801584,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801584
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.029605103217038332,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.029605103217038332
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517934,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517934
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993445,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993445
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4134078212290503,
"acc_stderr": 0.01646981492840617,
"acc_norm": 0.4134078212290503,
"acc_norm_stderr": 0.01646981492840617
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262196,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262196
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5104302477183833,
"acc_stderr": 0.012767457253930648,
"acc_norm": 0.5104302477183833,
"acc_norm_stderr": 0.012767457253930648
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7683823529411765,
"acc_stderr": 0.025626533803777562,
"acc_norm": 0.7683823529411765,
"acc_norm_stderr": 0.025626533803777562
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.018718067052623227,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.018718067052623227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.02653704531214529,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.02653704531214529
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4810281517747858,
"mc1_stderr": 0.01749089640576235,
"mc2": 0.642058591491927,
"mc2_stderr": 0.015391497190020965
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781074
},
"harness|gsm8k|5": {
"acc": 0.7103866565579985,
"acc_stderr": 0.01249392734865963
}
}
```
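For quick post-processing, the per-task scores above can be averaged locally. The snippet below is a minimal sketch that assumes the linked `results_2024-01-14T02-37-03.730641.json` file has been downloaded; depending on the file version, the per-task entries may sit at the top level or under a `results` key.
```python
import json

def mmlu_average(per_task: dict) -> float:
    # average accuracy over all MMLU (hendrycksTest) sub-tasks
    accs = [
        scores["acc"]
        for task, scores in per_task.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

with open("results_2024-01-14T02-37-03.730641.json", "r", encoding="utf-8") as f:
    payload = json.load(f)

# fall back to the top level if there is no nested "results" key
per_task = payload.get("results", payload)
print(f"MMLU average accuracy: {mmlu_average(per_task):.4f}")
```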
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b | [
"region:us"
] | 2024-01-14T02:39:24+00:00 | {"pretty_name": "Evaluation run of macadeliccc/SOLAR-math-2x10.7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/SOLAR-math-2x10.7b](https://huggingface.co/macadeliccc/SOLAR-math-2x10.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T02:37:03.730641](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b/blob/main/results_2024-01-14T02-37-03.730641.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.672153123323673,\n \"acc_stderr\": 0.03128879331345752,\n \"acc_norm\": 0.6725032879829345,\n \"acc_norm_stderr\": 0.031933166428242975,\n \"mc1\": 0.4810281517747858,\n \"mc1_stderr\": 0.01749089640576235,\n \"mc2\": 0.642058591491927,\n \"mc2_stderr\": 0.015391497190020965\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6493174061433447,\n \"acc_stderr\": 0.013944635930726099,\n \"acc_norm\": 0.6843003412969283,\n \"acc_norm_stderr\": 0.013582571095815291\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6778530173272257,\n \"acc_stderr\": 0.004663439181149046,\n \"acc_norm\": 0.8630750846444931,\n \"acc_norm_stderr\": 0.0034306550069275825\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099583,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099583\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n 
\"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4708994708994709,\n \"acc_stderr\": 0.025707658614154957,\n \"acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.025707658614154957\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033446,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033446\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n 
\"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.03017680828897434,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.03017680828897434\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801584,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801584\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n \"acc_stderr\": 0.029605103217038332,\n \"acc_norm\": 0.7354260089686099,\n \"acc_norm_stderr\": 0.029605103217038332\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517934,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517934\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993445,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 
0.013625556907993445\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4134078212290503,\n \"acc_stderr\": 0.01646981492840617,\n \"acc_norm\": 0.4134078212290503,\n \"acc_norm_stderr\": 0.01646981492840617\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02355083135199509,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02355083135199509\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817962,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817962\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262196,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262196\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5104302477183833,\n \"acc_stderr\": 0.012767457253930648,\n \"acc_norm\": 0.5104302477183833,\n \"acc_norm_stderr\": 0.012767457253930648\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7683823529411765,\n \"acc_stderr\": 0.025626533803777562,\n \"acc_norm\": 0.7683823529411765,\n \"acc_norm_stderr\": 0.025626533803777562\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.018718067052623227,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.018718067052623227\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.02653704531214529,\n \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.02653704531214529\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4810281517747858,\n \"mc1_stderr\": 0.01749089640576235,\n \"mc2\": 0.642058591491927,\n \"mc2_stderr\": 0.015391497190020965\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781074\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7103866565579985,\n \"acc_stderr\": 0.01249392734865963\n }\n}\n```", "repo_url": "https://huggingface.co/macadeliccc/SOLAR-math-2x10.7b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|arc:challenge|25_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|gsm8k|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hellaswag|10_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-37-03.730641.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-37-03.730641.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-37-03.730641.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-37-03.730641.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-37-03.730641.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|winogrande|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["results_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T02-37-03.730641.parquet"]}]}]} | 2024-01-14T02:39:46+00:00 |
8c1b5ac7524acadf56210e567c964b0ad72eb6f4 |
# Dataset Card for Evaluation run of gagan3012/Multirial
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gagan3012/Multirial](https://huggingface.co/gagan3012/Multirial) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gagan3012__Multirial",
"harness_winogrande_5",
split="train")
```
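The configurations listed in this card's metadata also expose a `latest` split and an aggregated `results` configuration. The following is a minimal sketch, assuming those split and configuration names, for loading the aggregated results directly:
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration.
# The "latest" split name is taken from the configs listed in this card's
# metadata; adjust it if the repository layout changes.
results = load_dataset(
    "open-llm-leaderboard/details_gagan3012__Multirial",
    "results",
    split="latest",
)
print(results)
```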
## Latest results
These are the [latest results from run 2024-01-14T02:38:13.132787](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__Multirial/blob/main/results_2024-01-14T02-38-13.132787.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6087068516861436,
"acc_stderr": 0.032980911385021405,
"acc_norm": 0.6135781515215905,
"acc_norm_stderr": 0.03364558465127436,
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5469648449991642,
"mc2_stderr": 0.01540322430997804
},
"harness|arc:challenge|25": {
"acc": 0.5947098976109215,
"acc_stderr": 0.01434686906022933,
"acc_norm": 0.6322525597269625,
"acc_norm_stderr": 0.014090995618168478
},
"harness|hellaswag|10": {
"acc": 0.6061541525592511,
"acc_stderr": 0.0048760280379419405,
"acc_norm": 0.7956582354112727,
"acc_norm_stderr": 0.0040239573344619875
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.02698528957655274,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.02698528957655274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.02491524398598785,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.02491524398598785
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.03156663099215416,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.03156663099215416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045803,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045803
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597542,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597542
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.01492744710193716,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.01492744710193716
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069706,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069706
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379772,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379772
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.01955964680921593,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.01955964680921593
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.029393609319879804,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.029393609319879804
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5469648449991642,
"mc2_stderr": 0.01540322430997804
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855576
},
"harness|gsm8k|5": {
"acc": 0.4040940106141016,
"acc_stderr": 0.013516752972721716
}
}
```
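As a quick sanity check on the numbers above, the per-task entries can be aggregated directly from the JSON. The sketch below assumes the block has been saved locally as `results.json` (an illustrative filename, not a file shipped with this repository) and averages accuracy over the MMLU (`hendrycksTest`) subtasks:
```python
import json
from statistics import mean

# Illustrative only: "results.json" is assumed to contain the JSON block above.
with open("results.json") as f:
    results = json.load(f)

# Average accuracy across the 57 MMLU (hendrycksTest) subtasks.
mmlu_acc = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_acc)} MMLU subtasks, mean acc = {mean(mmlu_acc):.4f}")
```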
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_gagan3012__Multirial | [
"region:us"
] | 2024-01-14T02:40:30+00:00 | {"pretty_name": "Evaluation run of gagan3012/Multirial", "dataset_summary": "Dataset automatically created during the evaluation run of model [gagan3012/Multirial](https://huggingface.co/gagan3012/Multirial) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gagan3012__Multirial\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T02:38:13.132787](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__Multirial/blob/main/results_2024-01-14T02-38-13.132787.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6087068516861436,\n \"acc_stderr\": 0.032980911385021405,\n \"acc_norm\": 0.6135781515215905,\n \"acc_norm_stderr\": 0.03364558465127436,\n \"mc1\": 0.37576499388004897,\n \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5469648449991642,\n \"mc2_stderr\": 0.01540322430997804\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5947098976109215,\n \"acc_stderr\": 0.01434686906022933,\n \"acc_norm\": 0.6322525597269625,\n \"acc_norm_stderr\": 0.014090995618168478\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6061541525592511,\n \"acc_stderr\": 0.0048760280379419405,\n \"acc_norm\": 0.7956582354112727,\n \"acc_norm_stderr\": 0.0040239573344619875\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n 
\"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099834,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099834\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n \"acc_stderr\": 0.02698528957655274,\n \"acc_norm\": 0.6580645161290323,\n \"acc_norm_stderr\": 0.02698528957655274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5923076923076923,\n \"acc_stderr\": 
0.02491524398598785,\n \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.02491524398598785\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.03156663099215416,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.03156663099215416\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597542,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597542\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n \"acc_stderr\": 0.01492744710193716,\n \"acc_norm\": 0.7752234993614304,\n \"acc_norm_stderr\": 0.01492744710193716\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069706,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069706\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n \"acc_stderr\": 0.012695244711379772,\n \"acc_norm\": 0.44589308996088656,\n \"acc_norm_stderr\": 0.012695244711379772\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.01955964680921593,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.01955964680921593\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37576499388004897,\n \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5469648449991642,\n \"mc2_stderr\": 0.01540322430997804\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855576\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4040940106141016,\n \"acc_stderr\": 0.013516752972721716\n }\n}\n```", "repo_url": "https://huggingface.co/gagan3012/Multirial", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|arc:challenge|25_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|gsm8k|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hellaswag|10_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-38-13.132787.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-38-13.132787.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-38-13.132787.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-38-13.132787.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-38-13.132787.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|winogrande|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["results_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T02-38-13.132787.parquet"]}]}]} | 2024-01-14T02:40:54+00:00 |
4d51106e5e48a11426cc32cfdf465589ca46826e | paulykim/MPMedical_2 | [
"region:us"
] | 2024-01-14T02:54:55+00:00 | {} | 2024-01-14T02:54:55+00:00 |
|
b9993eca96a202eea3e9d8f77eca151d4c252fed |
# Dataset of a_91/A-91/A-91 (Girls' Frontline)
This is the dataset of a_91/A-91/A-91 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, yellow_eyes, hair_between_eyes, mole, mole_under_eye, bangs, large_breasts, hat, medium_breasts, multicolored_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 29.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 15.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 46 | 29.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 24.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 46 | 42.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/a_91_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, blush, smile, looking_at_viewer, solo, gloves, open_mouth, black_bodysuit, holding, cleavage, drunk |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | smile | looking_at_viewer | solo | gloves | open_mouth | black_bodysuit | holding | cleavage | drunk |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:--------------------|:-------|:---------|:-------------|:-----------------|:----------|:-----------|:--------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/a_91_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:55:02+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:59:23+00:00 |
1ab7ff4f8dca159b31f1fd54c8437302be50679b |
# Dataset of fx_05/FX-05/FX-05 (Girls' Frontline)
This is the dataset of fx_05/FX-05/FX-05 (Girls' Frontline), containing 14 images and their tags.
The core tags of this character are `blue_eyes, long_hair, breasts, large_breasts, grey_hair, hat, bangs, very_long_hair, black_headwear, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 14 | 18.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fx_05_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 14 | 11.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fx_05_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 33 | 22.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fx_05_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 14 | 17.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fx_05_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 33 | 31.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fx_05_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fx_05_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, solo, looking_at_viewer, pantyhose, jewelry, holding, smile, assault_rifle, jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | pantyhose | jewelry | holding | smile | assault_rifle | jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:------------|:----------|:----------|:--------|:----------------|:---------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
| CyberHarem/fx_05_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:55:07+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:00:10+00:00 |
6f13738d010c8a3705dc75b22c5e0c419f5125fc |
# Dataset of ads/ADS/ADS (Girls' Frontline)
This is the dataset of ads/ADS/ADS (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `blue_eyes, blue_hair, long_hair, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 20.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ads_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 10.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ads_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 20.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ads_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 17.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ads_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 30.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ads_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ads_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, white_gloves, barefoot, blue_dress, puffy_short_sleeves, full_body, see-through, white_background, assault_rifle, closed_mouth, holding, simple_background, white_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | white_gloves | barefoot | blue_dress | puffy_short_sleeves | full_body | see-through | white_background | assault_rifle | closed_mouth | holding | simple_background | white_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:---------------|:-----------|:-------------|:----------------------|:------------|:--------------|:-------------------|:----------------|:---------------|:----------|:--------------------|:--------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ads_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:55:13+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:58:56+00:00 |
ec70b1f436a4c16fd92d067dca611aa6526cd21c | llm-aes/asap-7-original | [
"region:us"
] | 2024-01-14T02:55:35+00:00 | {"dataset_info": {"features": [{"name": "essay_id", "dtype": "int64"}, {"name": "essay_set", "dtype": "int64"}, {"name": "essay", "dtype": "string"}, {"name": "rater1_domain1", "dtype": "int64"}, {"name": "rater2_domain1", "dtype": "int64"}, {"name": "domain1_score", "dtype": "int64"}, {"name": "rater1_trait1", "dtype": "float64"}, {"name": "rater1_trait2", "dtype": "float64"}, {"name": "rater1_trait3", "dtype": "float64"}, {"name": "rater1_trait4", "dtype": "float64"}, {"name": "rater2_trait1", "dtype": "float64"}, {"name": "rater2_trait2", "dtype": "float64"}, {"name": "rater2_trait3", "dtype": "float64"}, {"name": "rater2_trait4", "dtype": "float64"}, {"name": "rubrics", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 4907573, "num_examples": 1569}], "download_size": 842177, "dataset_size": 4907573}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T02:55:36+00:00 |
|
ea4c46aa7c677021de9512d6151a785b17e217c5 | llm-aes/asap-8-original | [
"region:us"
] | 2024-01-14T02:55:36+00:00 | {"dataset_info": {"features": [{"name": "essay_id", "dtype": "int64"}, {"name": "essay_set", "dtype": "int64"}, {"name": "essay", "dtype": "string"}, {"name": "rater1_domain1", "dtype": "int64"}, {"name": "rater2_domain1", "dtype": "int64"}, {"name": "domain1_score", "dtype": "int64"}, {"name": "rater1_trait1", "dtype": "float64"}, {"name": "rater1_trait2", "dtype": "float64"}, {"name": "rater1_trait3", "dtype": "float64"}, {"name": "rater1_trait4", "dtype": "float64"}, {"name": "rater1_trait5", "dtype": "float64"}, {"name": "rater1_trait6", "dtype": "float64"}, {"name": "rater2_trait1", "dtype": "float64"}, {"name": "rater2_trait2", "dtype": "float64"}, {"name": "rater2_trait3", "dtype": "float64"}, {"name": "rater2_trait4", "dtype": "float64"}, {"name": "rater2_trait5", "dtype": "float64"}, {"name": "rater2_trait6", "dtype": "float64"}, {"name": "rubrics", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 2672885, "num_examples": 723}], "download_size": 1352624, "dataset_size": 2672885}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T02:55:37+00:00 |
|
b7bd5749fa80cb8d90c3769093f58792a4199cd3 |
# Dataset of hatsushimo/初霜/初霜 (Azur Lane)
This is the dataset of hatsushimo/初霜/初霜 (Azur Lane), containing 11 images and their tags.
The core tags of this character are `animal_ears, hair_ornament, pink_hair, animal_ear_fluff, cat_ears, hairclip, red_eyes, ahoge, bangs, hair_between_eyes, cat_tail, fang, long_hair, tail, breasts, twintails, cat_girl, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 16.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsushimo_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 10.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsushimo_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 29 | 21.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsushimo_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 14.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsushimo_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 29 | 29.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsushimo_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hatsushimo_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, open_mouth, :d, bare_shoulders, kimono, long_sleeves, thighhighs, wide_sleeves, choker, jingle_bell, cleavage, collarbone, garter_straps, obi, pleated_skirt, simple_background, underwear, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | solo | open_mouth | :d | bare_shoulders | kimono | long_sleeves | thighhighs | wide_sleeves | choker | jingle_bell | cleavage | collarbone | garter_straps | obi | pleated_skirt | simple_background | underwear | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:-------------|:-----|:-----------------|:---------|:---------------|:-------------|:---------------|:---------|:--------------|:-----------|:-------------|:----------------|:------|:----------------|:--------------------|:------------|:-------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/hatsushimo_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:57:47+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:30:09+00:00 |
40c3d282435bcea229d7bf22183c81146630cd58 |
# Dataset of georg_thiele/ゲオルク・ティーレ/Z2 (Azur Lane)
This is the dataset of georg_thiele/ゲオルク・ティーレ/Z2 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `bangs, red_eyes, long_hair, braid, black_hair, brown_hair, hat, beret, bow, hair_bun, red_bow, single_hair_bun, single_side_bun`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 12.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 7.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 18 | 13.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 10.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 18 | 18.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/georg_thiele_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, solo, full_body, long_sleeves, obi, closed_mouth, simple_background, sitting, standing, white_background, wide_sleeves, barefoot, black_footwear, boots, candy_apple, holding_food, jacket, striped_kimono, yukata |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | full_body | long_sleeves | obi | closed_mouth | simple_background | sitting | standing | white_background | wide_sleeves | barefoot | black_footwear | boots | candy_apple | holding_food | jacket | striped_kimono | yukata |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:------------|:---------------|:------|:---------------|:--------------------|:----------|:-----------|:-------------------|:---------------|:-----------|:-----------------|:--------|:--------------|:---------------|:---------|:-----------------|:---------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/georg_thiele_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:57:53+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:09:17+00:00 |
ae9fa160b4d65a8b6b0c50b6d9a50ae490e0bacd |
# Dataset of ise/伊勢/伊势 (Azur Lane)
This is the dataset of ise/伊勢/伊势 (Azur Lane), containing 11 images and their tags.
The core tags of this character are `animal_ears, breasts, fox_ears, red_hair, fox_tail, tail, hair_ornament, ponytail, bangs, large_breasts, long_hair, medium_breasts, red_eyes, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 14.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 8.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 23 | 15.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 12.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 23 | 24.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ise_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, midriff, navel, smile, cleavage, fingerless_gloves, hakama_skirt, simple_background, collarbone, black_gloves, full_body, standing, sword, white_background, detached_sleeves, hip_vent, holding_weapon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | midriff | navel | smile | cleavage | fingerless_gloves | hakama_skirt | simple_background | collarbone | black_gloves | full_body | standing | sword | white_background | detached_sleeves | hip_vent | holding_weapon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:----------|:--------|:--------|:-----------|:--------------------|:---------------|:--------------------|:-------------|:---------------|:------------|:-----------|:--------|:-------------------|:-------------------|:-----------|:-----------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ise_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:57:57+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:01:19+00:00 |
7c7fd84cba2d68d4235219842a9e1f4c9c324100 |
# Dataset of laffey_ii/ラフィーII/拉菲II (Azur Lane)
This is the dataset of laffey_ii/ラフィーII/拉菲II (Azur Lane), containing 34 images and their tags.
The core tags of this character are `long_hair, twintails, white_hair, rabbit_ears, red_eyes, animal_ears, bangs, hairband, fake_animal_ears, very_long_hair, breasts, hair_between_eyes, ribbon, small_breasts, hair_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 34 | 59.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_ii_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 34 | 26.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_ii_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 95 | 66.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_ii_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 34 | 49.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_ii_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 95 | 102.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_ii_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/laffey_ii_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, solo, bare_shoulders, looking_at_viewer, blush, white_thighhighs, long_sleeves, white_dress, off_shoulder, collarbone, parted_lips, simple_background, sleeves_past_fingers |
| 1 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, playboy_bunny, solo, white_pantyhose, cup, full_body, official_alternate_costume, strapless_leotard, blue_leotard, blush, holding_tray, no_shoes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | looking_at_viewer | blush | white_thighhighs | long_sleeves | white_dress | off_shoulder | collarbone | parted_lips | simple_background | sleeves_past_fingers | playboy_bunny | white_pantyhose | cup | full_body | official_alternate_costume | strapless_leotard | blue_leotard | holding_tray | no_shoes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:--------------------|:--------|:-------------------|:---------------|:--------------|:---------------|:-------------|:--------------|:--------------------|:-----------------------|:----------------|:------------------|:------|:------------|:-----------------------------|:--------------------|:---------------|:---------------|:-----------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/laffey_ii_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:03:09+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:13:55+00:00 |
877e35169d6ff11a14da6c9e667a53c64d24d2fe | open-llm-leaderboard/details_moreh__MoMo-72B-lora-1.8.4-DPO | [
"region:us"
] | 2024-01-14T03:03:43+00:00 | {} | 2024-01-25T08:19:56+00:00 |
|
a75789e4090380407b6ada0f8d443eb5e7a1300a | # Data Format(s)
The Lichess dataset of 16M chess games was used. These games were transcoded into UCI notation, with two minor modifications: the BOS token (`;`) is prepended to every game, and the EOS token `#` is appended whenever the game ends in checkmate.
# Character-based encoding vocab
Tokenization is simplified by using a vocabulary with 23 characters in the following order:
```
[' ', '1', '2', '3', '4', '5', '6', '7', '8', ';', '#', 'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'n', 'r', 'q', 'k']
```
With the exception of `'b'`, which marks both the b-file and a bishop promotion, every token has a unique purpose in the dataset.
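As a quick illustration, below is a minimal sketch of the character-to-index mapping this vocabulary implies (not part of the original card; the names `VOCAB`, `encode`, and `decode` are placeholders). The assertions mirror the worked examples discussed below.
```python
# Minimal sketch of the 23-token character vocabulary described above.
VOCAB = [' ', '1', '2', '3', '4', '5', '6', '7', '8', ';', '#',
         'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'n', 'r', 'q', 'k']
STOI = {ch: i for i, ch in enumerate(VOCAB)}  # character -> token id
ITOS = {i: ch for i, ch in enumerate(VOCAB)}  # token id -> character

def encode(uci_text: str) -> list[int]:
    """Map a UCI move string (possibly containing BOS ';' and EOS '#') to token ids."""
    return [STOI[ch] for ch in uci_text]

def decode(ids: list[int]) -> str:
    """Map token ids back to the UCI string."""
    return ''.join(ITOS[i] for i in ids)

# These mirror the worked examples in the card text:
assert encode(';') == [9]                      # every game begins with the BOS token
assert encode('e2e4') == [15, 2, 15, 4]
assert encode('g1f3') == [17, 1, 16, 3]
assert encode('b1b8 a8b8#') == [12, 1, 12, 8, 0, 11, 8, 12, 8, 10]
assert decode(encode('b1b8 a8b8#')) == 'b1b8 a8b8#'
```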
This vocab also makes the encoding/decoding of board squares intuitive for human readers. Games always begin with a 9 (the BOS token `;`). Columns (a through h) are encoded to 11 through 18, and rows are encoded to their integer values 1 through 8. So the move `e2e4` becomes `[15, 2, 15, 4]`, and the move `g1f3` becomes `[17, 1, 16, 3]`; this is convenient because the '7' in 17 and the '6' in 16 correspond to the 7th and 6th columns (g and f) respectively. Likewise, breaks between moves are encoded as `0` and checkmate is encoded as `10`, so the sequence `b1b8 a8b8#` becomes `[12,1,12,8,0,11,8,12,8,10]`. | austindavis/chess_mi | [
"task_categories:text-generation",
"size_categories:10M<n<100M",
"region:us"
] | 2024-01-14T03:10:44+00:00 | {"size_categories": ["10M<n<100M"], "task_categories": ["text-generation"], "pretty_name": "Chess Mech Interp"} | 2024-01-15T08:52:07+00:00 |
fd55eb4a59f94c4e20d04a7efa858cafc3a3c2c3 |
# Dataset of m38/M38/伯莱塔38型 (Girls' Frontline)
This is the dataset of m38/M38/伯莱塔38型 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are `blue_eyes, long_hair, ahoge, hat, bangs, beret, hair_ornament, brown_hair, hairclip, breasts, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 13.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 6.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 22 | 13.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 10.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 22 | 20.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m38_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, simple_background, solo, white_shirt, long_sleeves, pleated_skirt, submachine_gun, white_background, white_thighhighs, black_footwear, black_skirt, closed_mouth, holding_gun, jacket, military_uniform, red_necktie, loafers, belt, blush, collared_shirt, full_body, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | simple_background | solo | white_shirt | long_sleeves | pleated_skirt | submachine_gun | white_background | white_thighhighs | black_footwear | black_skirt | closed_mouth | holding_gun | jacket | military_uniform | red_necktie | loafers | belt | blush | collared_shirt | full_body | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------------------|:-------|:--------------|:---------------|:----------------|:-----------------|:-------------------|:-------------------|:-----------------|:--------------|:---------------|:--------------|:---------|:-------------------|:--------------|:----------|:-------|:--------|:-----------------|:------------|:-----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/m38_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:22:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:26:41+00:00 |
dc00c3ed78f3d258fa32c6d169a032c5670188f2 |
# Dataset of js05/JS05/JS05 (Girls' Frontline)
This is the dataset of js05/JS05/JS05 (Girls' Frontline), containing 13 images and their tags.
The core tags of this character are `short_hair, green_eyes, grey_hair, bangs, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 14.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/js05_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 9.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/js05_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 18.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/js05_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 14.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/js05_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 25.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/js05_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/js05_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
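Since every `item` carries its tags in `item.meta['tags']`, the same source can be filtered before further processing. The sketch below keeps only images carrying the `solo` tag from the cluster below; it assumes the tag container supports membership tests with `in`, which this card only prints rather than specifies.

```python
from waifuc.source import LocalSource

# reuse the directory extracted by the snippet above
dataset_dir = 'dataset_dir'
source = LocalSource(dataset_dir)

# keep only the items whose tags include 'solo' (the tag container is assumed to support `in`)
solo_items = [item for item in source if 'solo' in item.meta['tags']]
print(f'kept {len(solo_items)} images tagged "solo"')
```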
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, solo, black_gloves, looking_at_viewer, simple_background, fingerless_gloves, closed_mouth, jewelry, smile, white_background, bare_shoulders, choker, elbow_gloves, holding, skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | black_gloves | looking_at_viewer | simple_background | fingerless_gloves | closed_mouth | jewelry | smile | white_background | bare_shoulders | choker | elbow_gloves | holding | skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------------|:--------------------|:--------------------|:---------------|:----------|:--------|:-------------------|:-----------------|:---------|:---------------|:----------|:--------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/js05_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:22:22+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:25:43+00:00 |
5baa6e4e0b323b5e4d81f30614eb538e96a02375 |
# Dataset of f1/F1/F1 (Girls' Frontline)
This is the dataset of f1/F1/F1 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are `hat, blue_eyes, brown_hair, long_hair, twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 10.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/f1_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 6.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/f1_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 22 | 13.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/f1_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 10.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/f1_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 22 | 18.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/f1_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/f1_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
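The loaded items can also be written back out as a plain image-plus-tags folder for training tools that expect that layout. This is only a sketch: it assumes `item.image` is a PIL image and that a comma-separated `.txt` file per image is an acceptable export format.

```python
import os

from waifuc.source import LocalSource

# export each image together with a comma-separated tag file
out_dir = 'export_dir'
os.makedirs(out_dir, exist_ok=True)

source = LocalSource('dataset_dir')
for i, item in enumerate(source):
    # item.image is assumed to be a PIL image exposing .save()
    item.image.save(os.path.join(out_dir, f'{i:04d}.png'))
    with open(os.path.join(out_dir, f'{i:04d}.txt'), 'w', encoding='utf-8') as f:
        f.write(', '.join(item.meta['tags']))
```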
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, open_mouth, holding, looking_at_viewer, boots, fingerless_gloves, scarf, :d, rifle, shirt, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | open_mouth | holding | looking_at_viewer | boots | fingerless_gloves | scarf | :d | rifle | shirt | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------|:----------|:--------------------|:--------|:--------------------|:--------|:-----|:--------|:--------|:--------------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/f1_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:22:27+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:25:29+00:00 |
909e14747ebbc746972e70564f08b782d88874bd |
# Dataset Card for Evaluation run of jefferylovely/AthenaImaniMaven
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jefferylovely/AthenaImaniMaven](https://huggingface.co/jefferylovely/AthenaImaniMaven) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven",
"harness_winogrande_5",
split="train")
```
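The other configurations follow the same pattern; only the configuration name and the split change. For example, a specific run can be addressed by its timestamped split, and the aggregated "results" configuration described above can be loaded the same way (the configuration and split names below are taken from this card's description):

```python
from datasets import load_dataset

# details of one task for a specific run, addressed by its timestamped split
gsm8k_details = load_dataset("open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven",
	"harness_gsm8k_5",
	split="2024_01_14T03_41_28.738425")

# aggregated results; the "train" split always points to the latest run
results = load_dataset("open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven",
	"results",
	split="train")
```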
## Latest results
These are the [latest results from run 2024-01-14T03:41:28.738425](https://huggingface.co/datasets/open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven/blob/main/results_2024-01-14T03-41-28.738425.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5906347296238645,
"acc_stderr": 0.03376761541575429,
"acc_norm": 0.5954572418779962,
"acc_norm_stderr": 0.03446646542818655,
"mc1": 0.42105263157894735,
"mc1_stderr": 0.01728393624813649,
"mc2": 0.5857820006375237,
"mc2_stderr": 0.015441927798310004
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449698,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759084
},
"harness|hellaswag|10": {
"acc": 0.6552479585739892,
"acc_stderr": 0.004743160034271149,
"acc_norm": 0.8465445130452102,
"acc_norm_stderr": 0.0035968938961909148
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.029445175328199586,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.029445175328199586
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.04966570903978529,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.04966570903978529
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.02659308451657228,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.02659308451657228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245282,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245282
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130952,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130952
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652456,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.01760430414925648,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.01760430414925648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.029696338713422872,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.029696338713422872
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560396,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560396
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398677,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398677
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977243,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977243
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.293854748603352,
"acc_stderr": 0.015235075776719613,
"acc_norm": 0.293854748603352,
"acc_norm_stderr": 0.015235075776719613
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.02751392568354943,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.02751392568354943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.02691500301138016,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.02691500301138016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144363,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144363
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41264667535853977,
"acc_stderr": 0.012573836633799015,
"acc_norm": 0.41264667535853977,
"acc_norm_stderr": 0.012573836633799015
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.02981263070156974,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.02981263070156974
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.019910377463105932,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.019910377463105932
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322605,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764003,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764003
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42105263157894735,
"mc1_stderr": 0.01728393624813649,
"mc2": 0.5857820006375237,
"mc2_stderr": 0.015441927798310004
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663595
},
"harness|gsm8k|5": {
"acc": 0.3502653525398029,
"acc_stderr": 0.01314040945557127
}
}
```
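Because every per-task entry in the blob above uses the same field names, it can be summarised mechanically. The helper below is a sketch: it expects a dict with the exact top-level layout shown above (whether the linked JSON file shares that layout is an assumption) and averages the accuracy of the `hendrycksTest` (MMLU) subsets.

```python
def mmlu_average(latest_results: dict, field: str = "acc") -> float:
    """Mean of `field` over the harness|hendrycksTest-* entries of a results dict.

    `latest_results` is assumed to have the same top-level layout as the blob above.
    """
    values = [
        scores[field]
        for name, scores in latest_results.items()
        if name.startswith("harness|hendrycksTest-")
    ]
    return sum(values) / len(values)
```

Applied to the dictionary above, this gives the mean MMLU accuracy; passing `field="acc_norm"` averages the normalised scores instead.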
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven | [
"region:us"
] | 2024-01-14T03:23:58+00:00 | {"pretty_name": "Evaluation run of jefferylovely/AthenaImaniMaven", "dataset_summary": "Dataset automatically created during the evaluation run of model [jefferylovely/AthenaImaniMaven](https://huggingface.co/jefferylovely/AthenaImaniMaven) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T03:41:28.738425](https://huggingface.co/datasets/open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven/blob/main/results_2024-01-14T03-41-28.738425.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5906347296238645,\n \"acc_stderr\": 0.03376761541575429,\n \"acc_norm\": 0.5954572418779962,\n \"acc_norm_stderr\": 0.03446646542818655,\n \"mc1\": 0.42105263157894735,\n \"mc1_stderr\": 0.01728393624813649,\n \"mc2\": 0.5857820006375237,\n \"mc2_stderr\": 0.015441927798310004\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449698,\n \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759084\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6552479585739892,\n \"acc_stderr\": 0.004743160034271149,\n \"acc_norm\": 0.8465445130452102,\n \"acc_norm_stderr\": 0.0035968938961909148\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.029445175328199586,\n \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.029445175328199586\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n 
\"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n \"acc_stderr\": 0.02659308451657228,\n \"acc_norm\": 0.6774193548387096,\n \"acc_norm_stderr\": 0.02659308451657228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245282,\n \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245282\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6076923076923076,\n \"acc_stderr\": 0.024756000382130952,\n \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130952\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652456,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652456\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7853211009174312,\n \"acc_stderr\": 0.01760430414925648,\n \"acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.01760430414925648\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.03198001660115071,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.03198001660115071\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7046413502109705,\n \"acc_stderr\": 0.029696338713422872,\n \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.029696338713422872\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n \"acc_stderr\": 0.015246803197398677,\n \"acc_norm\": 
0.7611749680715197,\n \"acc_norm_stderr\": 0.015246803197398677\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977243,\n \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977243\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.293854748603352,\n \"acc_stderr\": 0.015235075776719613,\n \"acc_norm\": 0.293854748603352,\n \"acc_norm_stderr\": 0.015235075776719613\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.02691500301138016,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.02691500301138016\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144363,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144363\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41264667535853977,\n \"acc_stderr\": 0.012573836633799015,\n \"acc_norm\": 0.41264667535853977,\n \"acc_norm_stderr\": 0.012573836633799015\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.02981263070156974,\n \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.02981263070156974\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.019910377463105932,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.019910377463105932\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322605,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322605\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764003,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764003\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42105263157894735,\n \"mc1_stderr\": 0.01728393624813649,\n \"mc2\": 0.5857820006375237,\n \"mc2_stderr\": 0.015441927798310004\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663595\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3502653525398029,\n \"acc_stderr\": 0.01314040945557127\n }\n}\n```", "repo_url": 
"https://huggingface.co/jefferylovely/AthenaImaniMaven", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|arc:challenge|25_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|arc:challenge|25_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|gsm8k|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|gsm8k|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hellaswag|10_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hellaswag|10_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-21-32.157923.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-41-28.738425.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-41-28.738425.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-41-28.738425.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-21-32.157923.parquet"]}, 
{"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|winogrande|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|winogrande|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["results_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": 
["results_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T03-41-28.738425.parquet"]}]}]} | 2024-01-14T03:44:09+00:00 |
b09e943f99c75a78bfb9e9df14b7e73e53fb55c4 | wu981526092/MGSD_V2 | [
"license:mit",
"region:us"
] | 2024-01-14T03:24:45+00:00 | {"license": "mit"} | 2024-01-14T20:47:56+00:00 |
|
b8f3f70b1786faf553c058a987bddd31d3f992a6 |
# Dataset of felix_schultz/フィリックス・シュルツ/菲利克斯·舒尔茨 (Azur Lane)
This is the dataset of felix_schultz/フィリックス・シュルツ/菲利克斯·舒尔茨 (Azur Lane), containing 26 images and their tags.
The core tags of this character are `long_hair, purple_hair, red_eyes, twintails, breasts, very_long_hair, bangs, small_breasts, horns`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 26 | 58.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/felix_schultz_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 26 | 26.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/felix_schultz_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 63 | 54.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/felix_schultz_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 26 | 48.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/felix_schultz_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 63 | 87.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/felix_schultz_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/felix_schultz_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
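The packaged `IMG+TXT` variants from the table above (for example `dataset-800.zip`) can also be consumed without waifuc. Below is a minimal sketch, assuming each image in the archive is paired with a same-named `.txt` file holding its comma-separated tags; the directory name `dataset_800` is arbitrary:

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/felix_schultz_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed layout: same stem, '.txt' suffix)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    tag_file = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(tag_file):
        with open(tag_file, 'r', encoding='utf-8') as f:
            tags = f.read().strip()
        print(name, '->', tags)
```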
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 26 |  |  |  |  |  | looking_at_viewer, 1girl, solo, bare_shoulders, elbow_gloves, navel, arms_up, open_mouth, revealing_clothes, armpits, black_gloves, blush, smile, thighs, black_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | 1girl | solo | bare_shoulders | elbow_gloves | navel | arms_up | open_mouth | revealing_clothes | armpits | black_gloves | blush | smile | thighs | black_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:-------|:-----------------|:---------------|:--------|:----------|:-------------|:--------------------|:----------|:---------------|:--------|:--------|:---------|:--------------|
| 0 | 26 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/felix_schultz_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:25:15+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:32:10+00:00 |
ca7d5ae4638dcc3da858303901586f6f35d8f09c |
# Dataset of maryland/メリーランド/马里兰 (Azur Lane)
This is the dataset of maryland/メリーランド/马里兰 (Azur Lane), containing 16 images and their tags.
The core tags of this character are `long_hair, ponytail, red_eyes, red_hair, breasts, large_breasts, bangs, hair_between_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 15.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryland_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 9.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryland_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 36 | 18.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryland_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 14.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryland_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 36 | 25.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryland_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/maryland_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, black_gloves, dress, thighhighs, cleavage, thigh_boots |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | black_gloves | dress | thighhighs | cleavage | thigh_boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:---------------|:--------|:-------------|:-----------|:--------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
| CyberHarem/maryland_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:25:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:30:46+00:00 |
db1d416d9434fe0a1fd47b59589ba9edc84bdb6c | sdsam/testing | [
"region:us"
] | 2024-01-14T03:28:40+00:00 | {} | 2024-01-14T18:36:47+00:00 |
|
0e5d79fe5b9a85c8ce462fe45d20a96eb2d5287e | feng456/reuters_articles | [
"region:us"
] | 2024-01-14T03:36:43+00:00 | {"dataset_info": {"features": [{"name": "title", "dtype": "string"}, {"name": "body", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13792576, "num_examples": 17262}, {"name": "validation", "num_bytes": 1870389, "num_examples": 2158}, {"name": "test", "num_bytes": 1379190, "num_examples": 2158}], "download_size": 10073414, "dataset_size": 17042155}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-14T03:36:58+00:00 |
|
ebd5b848efed4a654558e79103eb0a42a6efa945 |
# Dataset of p22/P22/P22 (Girls' Frontline)
This is the dataset of p22/P22/P22 (Girls' Frontline), containing 25 images and their tags.
The core tags of this character are `blue_eyes, short_hair, bangs, breasts, hair_between_eyes, black_hair, earrings, grey_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 25.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 16.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 29.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 23.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 39.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/p22_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | looking_at_viewer, solo, 1girl, blue_jacket, cleavage, navel, black_shorts, black_thighhighs, blush, checkered_flag, fingerless_gloves, full_body, highleg_panties, race_queen, short_shorts, bikini, headset, high_heels, official_alternate_costume, sitting, thigh_boots, black_gloves, blue_panties, collarbone, cropped_jacket, holding_flag, open_clothes, smile |
| 1 | 18 |  |  |  |  |  | 1girl, solo, looking_at_viewer, jewelry, smile, bare_shoulders, jacket, closed_mouth, sleeveless, black_nails, handgun, holding_gun, long_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | solo | 1girl | blue_jacket | cleavage | navel | black_shorts | black_thighhighs | blush | checkered_flag | fingerless_gloves | full_body | highleg_panties | race_queen | short_shorts | bikini | headset | high_heels | official_alternate_costume | sitting | thigh_boots | black_gloves | blue_panties | collarbone | cropped_jacket | holding_flag | open_clothes | smile | jewelry | bare_shoulders | jacket | closed_mouth | sleeveless | black_nails | handgun | holding_gun | long_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:-------|:--------|:--------------|:-----------|:--------|:---------------|:-------------------|:--------|:-----------------|:--------------------|:------------|:------------------|:-------------|:---------------|:---------|:----------|:-------------|:-----------------------------|:----------|:--------------|:---------------|:---------------|:-------------|:-----------------|:---------------|:---------------|:--------|:----------|:-----------------|:---------|:---------------|:-------------|:--------------|:----------|:--------------|:---------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 18 |  |  |  |  |  | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/p22_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:46:54+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:53:40+00:00 |
ab3167b14a5a1eb824f41b7e0be241c8c5f071f3 |
# Dataset of gr_mg23/GrMG23/HK23 (Girls' Frontline)
This is the dataset of gr_mg23/GrMG23/HK23 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `breasts, double_bun, hair_bun, long_hair, blonde_hair, large_breasts, purple_eyes, bangs, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 14.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 17.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 12.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 26.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gr_mg23_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, blush, gloves, looking_at_viewer, long_sleeves, open_mouth, white_background, black_skirt, black_thighhighs, pleated_skirt, black_jacket, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | gloves | looking_at_viewer | long_sleeves | open_mouth | white_background | black_skirt | black_thighhighs | pleated_skirt | black_jacket | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:---------|:--------------------|:---------------|:-------------|:-------------------|:--------------|:-------------------|:----------------|:---------------|:--------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/gr_mg23_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:46:55+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:49:42+00:00 |
12a9af32d6f23aa0497025920dbb79c46e97deaf |
# Dataset Card for Evaluation run of dfurman/HermesBagel-34B-v0.1
Dataset automatically created during the evaluation run of model [dfurman/HermesBagel-34B-v0.1](https://huggingface.co/dfurman/HermesBagel-34B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1",
"harness_winogrande_5",
    split="latest")
```
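The aggregated metrics can be loaded the same way through the "results" configuration mentioned above; a minimal sketch, assuming it follows the same "latest" split convention as the per-task configurations:

```python
from datasets import load_dataset

# aggregated metrics of the whole run live in the "results" configuration;
# "latest" is assumed to point at the most recent evaluation, matching the
# split naming convention of the per-task configurations
results = load_dataset(
    "open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1",
    "results",
    split="latest",
)
print(results[0])
```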
## Latest results
These are the [latest results from run 2024-01-14T03:53:56.861170](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1/blob/main/results_2024-01-14T03-53-56.861170.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7695763625614322,
"acc_stderr": 0.02793431209028075,
"acc_norm": 0.7740465788313311,
"acc_norm_stderr": 0.028460203996252778,
"mc1": 0.5006119951040392,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.673352473186811,
"mc2_stderr": 0.014617965588559495
},
"harness|arc:challenge|25": {
"acc": 0.6757679180887372,
"acc_stderr": 0.013678810399518822,
"acc_norm": 0.7056313993174061,
"acc_norm_stderr": 0.01331852846053942
},
"harness|hellaswag|10": {
"acc": 0.6638119896434973,
"acc_stderr": 0.004714386376337134,
"acc_norm": 0.8573989245170285,
"acc_norm_stderr": 0.003489509493001622
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.024974533450920697,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.024974533450920697
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349414,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349414
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7793103448275862,
"acc_stderr": 0.03455930201924813,
"acc_norm": 0.7793103448275862,
"acc_norm_stderr": 0.03455930201924813
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6957671957671958,
"acc_stderr": 0.02369541500946309,
"acc_norm": 0.6957671957671958,
"acc_norm_stderr": 0.02369541500946309
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9096774193548387,
"acc_stderr": 0.016306570644488313,
"acc_norm": 0.9096774193548387,
"acc_norm_stderr": 0.016306570644488313
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.01764652667723332,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.01764652667723332
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.01889552448260495,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.01889552448260495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.030296771286067323,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.030296771286067323
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8571428571428571,
"acc_stderr": 0.02273020811930654,
"acc_norm": 0.8571428571428571,
"acc_norm_stderr": 0.02273020811930654
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5298013245033113,
"acc_stderr": 0.040752249922169796,
"acc_norm": 0.5298013245033113,
"acc_norm_stderr": 0.040752249922169796
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769584,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769584
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6620370370370371,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.6620370370370371,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9007633587786259,
"acc_stderr": 0.02622223517147737,
"acc_norm": 0.9007633587786259,
"acc_norm_stderr": 0.02622223517147737
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.030381596756651655,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.030381596756651655
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8773006134969326,
"acc_stderr": 0.025777328426978927,
"acc_norm": 0.8773006134969326,
"acc_norm_stderr": 0.025777328426978927
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331366,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331366
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9054916985951469,
"acc_stderr": 0.010461015338193071,
"acc_norm": 0.9054916985951469,
"acc_norm_stderr": 0.010461015338193071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7977653631284917,
"acc_stderr": 0.013433729483320979,
"acc_norm": 0.7977653631284917,
"acc_norm_stderr": 0.013433729483320979
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8398692810457516,
"acc_stderr": 0.020998740930362303,
"acc_norm": 0.8398692810457516,
"acc_norm_stderr": 0.020998740930362303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.819935691318328,
"acc_stderr": 0.02182342285774494,
"acc_norm": 0.819935691318328,
"acc_norm_stderr": 0.02182342285774494
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8827160493827161,
"acc_stderr": 0.017903112615281123,
"acc_norm": 0.8827160493827161,
"acc_norm_stderr": 0.017903112615281123
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6560283687943262,
"acc_stderr": 0.02833801742861133,
"acc_norm": 0.6560283687943262,
"acc_norm_stderr": 0.02833801742861133
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6003911342894394,
"acc_stderr": 0.01251018163696068,
"acc_norm": 0.6003911342894394,
"acc_norm_stderr": 0.01251018163696068
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8419117647058824,
"acc_stderr": 0.022161462608068522,
"acc_norm": 0.8419117647058824,
"acc_norm_stderr": 0.022161462608068522
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273344,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8571428571428571,
"acc_stderr": 0.022401787435256396,
"acc_norm": 0.8571428571428571,
"acc_norm_stderr": 0.022401787435256396
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824664,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824664
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5006119951040392,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.673352473186811,
"mc2_stderr": 0.014617965588559495
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.010141944523750028
},
"harness|gsm8k|5": {
"acc": 0.6527672479150872,
"acc_stderr": 0.013113898382146875
}
}
```
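The per-task metrics above can be post-processed directly. As a minimal sketch (assuming the JSON block has been saved locally as `results.json`, a hypothetical filename), the mean accuracy over the MMLU (`hendrycksTest`) subjects can be computed like this:
```python
import json

# Load a local copy of the results JSON shown above (the filename is an assumption).
with open("results.json") as f:
    results = json.load(f)

# MMLU subject keys all start with "harness|hendrycksTest-" and each carries an "acc" field.
mmlu = {task: scores["acc"] for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")}
print(f"{len(mmlu)} MMLU subjects, mean acc = {sum(mmlu.values()) / len(mmlu):.4f}")
```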
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1 | [
"region:us"
] | 2024-01-14T03:56:07+00:00 | {"pretty_name": "Evaluation run of dfurman/HermesBagel-34B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [dfurman/HermesBagel-34B-v0.1](https://huggingface.co/dfurman/HermesBagel-34B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T03:53:56.861170](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1/blob/main/results_2024-01-14T03-53-56.861170.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7695763625614322,\n \"acc_stderr\": 0.02793431209028075,\n \"acc_norm\": 0.7740465788313311,\n \"acc_norm_stderr\": 0.028460203996252778,\n \"mc1\": 0.5006119951040392,\n \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.673352473186811,\n \"mc2_stderr\": 0.014617965588559495\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6757679180887372,\n \"acc_stderr\": 0.013678810399518822,\n \"acc_norm\": 0.7056313993174061,\n \"acc_norm_stderr\": 0.01331852846053942\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6638119896434973,\n \"acc_stderr\": 0.004714386376337134,\n \"acc_norm\": 0.8573989245170285,\n \"acc_norm_stderr\": 0.003489509493001622\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.024974533450920697,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.024974533450920697\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349414,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349414\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7793103448275862,\n \"acc_stderr\": 0.03455930201924813,\n \"acc_norm\": 0.7793103448275862,\n \"acc_norm_stderr\": 0.03455930201924813\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6957671957671958,\n \"acc_stderr\": 0.02369541500946309,\n \"acc_norm\": 0.6957671957671958,\n \"acc_norm_stderr\": 0.02369541500946309\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9096774193548387,\n \"acc_stderr\": 0.016306570644488313,\n \"acc_norm\": 0.9096774193548387,\n \"acc_norm_stderr\": 0.016306570644488313\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03344283744280458,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03344283744280458\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723332,\n \"acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723332\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.01889552448260495,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.01889552448260495\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.030296771286067323,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.030296771286067323\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.02273020811930654,\n \"acc_norm\": 0.8571428571428571,\n \"acc_norm_stderr\": 0.02273020811930654\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5298013245033113,\n \"acc_stderr\": 0.040752249922169796,\n \"acc_norm\": 0.5298013245033113,\n \"acc_norm_stderr\": 0.040752249922169796\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769584,\n \"acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769584\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6620370370370371,\n \"acc_stderr\": 0.03225941352631295,\n \"acc_norm\": 0.6620370370370371,\n \"acc_norm_stderr\": 0.03225941352631295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9007633587786259,\n \"acc_stderr\": 0.02622223517147737,\n \"acc_norm\": 0.9007633587786259,\n \"acc_norm_stderr\": 0.02622223517147737\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.030381596756651655,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.030381596756651655\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331366,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331366\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9054916985951469,\n \"acc_stderr\": 0.010461015338193071,\n \"acc_norm\": 0.9054916985951469,\n \"acc_norm_stderr\": 0.010461015338193071\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7977653631284917,\n \"acc_stderr\": 0.013433729483320979,\n \"acc_norm\": 0.7977653631284917,\n \"acc_norm_stderr\": 0.013433729483320979\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8398692810457516,\n \"acc_stderr\": 0.020998740930362303,\n \"acc_norm\": 0.8398692810457516,\n \"acc_norm_stderr\": 0.020998740930362303\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.819935691318328,\n \"acc_stderr\": 0.02182342285774494,\n \"acc_norm\": 0.819935691318328,\n \"acc_norm_stderr\": 0.02182342285774494\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8827160493827161,\n \"acc_stderr\": 0.017903112615281123,\n \"acc_norm\": 0.8827160493827161,\n \"acc_norm_stderr\": 0.017903112615281123\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6560283687943262,\n \"acc_stderr\": 0.02833801742861133,\n \"acc_norm\": 0.6560283687943262,\n \"acc_norm_stderr\": 0.02833801742861133\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6003911342894394,\n \"acc_stderr\": 0.01251018163696068,\n \"acc_norm\": 0.6003911342894394,\n \"acc_norm_stderr\": 0.01251018163696068\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8419117647058824,\n \"acc_stderr\": 0.022161462608068522,\n \"acc_norm\": 0.8419117647058824,\n \"acc_norm_stderr\": 0.022161462608068522\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273344,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273344\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.022401787435256396,\n \"acc_norm\": 0.8571428571428571,\n \"acc_norm_stderr\": 0.022401787435256396\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824664,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824664\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5006119951040392,\n \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.673352473186811,\n \"mc2_stderr\": 0.014617965588559495\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.010141944523750028\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6527672479150872,\n \"acc_stderr\": 0.013113898382146875\n 
}\n}\n```", "repo_url": "https://huggingface.co/dfurman/HermesBagel-34B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|arc:challenge|25_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|gsm8k|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hellaswag|10_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-53-56.861170.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-53-56.861170.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-53-56.861170.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-53-56.861170.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|winogrande|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T03_53_56.861170", "path": ["results_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T03-53-56.861170.parquet"]}]}]} | 2024-01-14T03:56:27+00:00 |
0dafc248bfd177e774c34ea45b9b70abb392f387 |
# AutoLamella Dataset
The AutoLamella dataset consists of images from multiple lamella preparation methods. All data is annotated for semantic segmentation and is available through the Hugging Face `datasets` API at [patrickcleeve/autolamella](https://huggingface.co/datasets/patrickcleeve/autolamella).
## Summary
| Dataset / Method | Train | Test | Total |
| ----------- | ----------- | -----------| -----------|
| Waffle | 214 | 76 | 290 |
| Liftout | 801 | 163 | 964 |
| Serial Liftout | 301 | 109 | 410 |
| **Full** | **1316** | **348** | **1664** |
Details about the datasets can be found in `summary.csv` in the dataset directory.
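If you want to inspect that file without cloning the repository, it can be fetched with the Hub API. A minimal sketch, assuming `summary.csv` sits at the root of the dataset repository (its exact columns are not documented here):
```python
import pandas as pd
from huggingface_hub import hf_hub_download

# Download summary.csv from the dataset repo (the repo-root path is an assumption).
csv_path = hf_hub_download(
    repo_id="patrickcleeve/autolamella",
    repo_type="dataset",
    filename="summary.csv",
)
print(pd.read_csv(csv_path).head())
```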
### Labels
Currently, the dataset is labelled for the following classes. In the future, we will add additional labels for objects such as ice contamination. If you would like to label this data, please see the labelling tools to get started.
```yaml
CLASS_LABELS: # autolamella
0: "background"
1: "lamella"
2: "manipulator"
3: "landing_post"
4: "copper_adaptor"
5: "volume_block"
```
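The display example further below uses `decode_segmap_v2` from the `fibsem` package to colour the masks. If that package is not installed, a minimal stand-in can be written from the class table above; the colours here are arbitrary placeholders, not the package's actual palette:
```python
import numpy as np

# class id -> RGB colour (arbitrary choices, for visualisation only)
CLASS_COLORS = {
    0: (0, 0, 0),        # background
    1: (255, 0, 0),      # lamella
    2: (0, 255, 0),      # manipulator
    3: (0, 0, 255),      # landing_post
    4: (255, 255, 0),    # copper_adaptor
    5: (255, 0, 255),    # volume_block
}

def decode_segmap(mask: np.ndarray) -> np.ndarray:
    """Map a 2D array of class ids to an RGB image."""
    rgb = np.zeros((*mask.shape, 3), dtype=np.uint8)
    for class_id, color in CLASS_COLORS.items():
        rgb[mask == class_id] = color
    return rgb
```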
## Download Datasets
To download the datasets, you can use the Hugging Face `datasets` API:
```python
from datasets import load_dataset
# download waffle dataset
ds = load_dataset("patrickcleeve/autolamella", name="waffle")
# download liftout dataset
ds = load_dataset("patrickcleeve/autolamella", name="liftout")
# download serial-liftout dataset
ds = load_dataset("patrickcleeve/autolamella", name="serial-liftout")
# download test split only
ds = load_dataset("patrickcleeve/autolamella", name="waffle", split="test")
```
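Each call returns a `DatasetDict` (or a single `Dataset` when a split is requested). For the waffle configuration, the row counts should match the summary table above:
```python
print(ds)  # ds loaded with name="waffle" and no split argument
```
```yaml
DatasetDict({
    train: Dataset({
        features: ['image', 'annotation'],
        num_rows: 214
    })
    test: Dataset({
        features: ['image', 'annotation'],
        num_rows: 76
    })
})
```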
To display images and annotations:
```python
# show random image image and annotation (training split)
import random
import numpy as np
import matplotlib.pyplot as plt
from fibsem.segmentation.utils import decode_segmap_v2
# random data
idx = random.randint(0, len(ds["train"]) - 1)  # randint is inclusive on both ends
image = np.asarray(ds["train"][idx]["image"])
mask = np.asarray(ds["train"][idx]["annotation"])
# metadata
split = ds["train"].split
config_name = ds["train"].config_name
plt.title(f"{config_name}-{split}-{idx:02d}")
plt.imshow(image, cmap="gray", alpha=0.7)
plt.imshow(decode_segmap_v2(mask), alpha=0.3)
plt.axis("off")
plt.show()
```
| Waffle | Liftout | Serial Liftout |
| ----------- | ----------- | ----------- |
|  |  |  |
You can also concatenate the datasets into a single dataset for combined training (e.g. "mega" models):
```python
from datasets import load_dataset, concatenate_datasets
# load individual datasets
waffle_train_ds = load_dataset("patrickcleeve/autolamella", name="waffle", split="train")
liftout_train_ds = load_dataset("patrickcleeve/autolamella", name="liftout", split="train")
serial_liftout_train_ds = load_dataset("patrickcleeve/autolamella", name="serial-liftout", split="train")
# concatenate datasets (e.g. mega model)
train_ds = concatenate_datasets([waffle_train_ds, liftout_train_ds, serial_liftout_train_ds])
print(train_ds)
```
```yaml
Dataset({
features: ['image', 'annotation'],
num_rows: 1316
})
```
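From here the concatenated dataset can be fed straight into a training loop. The sketch below uses a PyTorch `DataLoader` with a simple collate function; the target resolution, batch size, greyscale conversion and dtypes are illustrative assumptions, not part of the dataset itself:
```python
import numpy as np
import torch
from torch.utils.data import DataLoader
from PIL import Image

TARGET_SIZE = (1024, 1536)  # (height, width), illustrative

def collate(batch):
    # resize images/masks to a common shape so they can be stacked into a batch
    images, masks = [], []
    for item in batch:
        image = item["image"].convert("L").resize(TARGET_SIZE[::-1], resample=Image.BILINEAR)
        mask = item["annotation"].resize(TARGET_SIZE[::-1], resample=Image.NEAREST)  # nearest keeps label values intact
        images.append(torch.from_numpy(np.asarray(image, dtype=np.float32)))
        masks.append(torch.from_numpy(np.asarray(mask, dtype=np.int64)))
    return torch.stack(images), torch.stack(masks)

loader = DataLoader(train_ds, batch_size=4, shuffle=True, collate_fn=collate)
images, masks = next(iter(loader))
print(images.shape, masks.shape)
```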
### Acknowledgement
- Waffle and Liftout data from Monash
- Serial Liftout data from MPI
| patrickcleeve/autolamella | [
"license:mit",
"region:us"
] | 2024-01-14T03:59:29+00:00 | {"license": "mit", "dataset_info": [{"config_name": "liftout", "features": [{"name": "image", "dtype": "image"}, {"name": "annotation", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 2479679335.0, "num_examples": 801}, {"name": "test", "num_bytes": 514295427.0, "num_examples": 163}], "download_size": 1540632118, "dataset_size": 2993974762.0}, {"config_name": "serial-liftout", "features": [{"name": "image", "dtype": "image"}, {"name": "annotation", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 946980390.0, "num_examples": 301}, {"name": "test", "num_bytes": 342926454.0, "num_examples": 109}], "download_size": 457168711, "dataset_size": 1289906844.0}, {"config_name": "waffle", "features": [{"name": "image", "dtype": "image"}, {"name": "annotation", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 673435138.0, "num_examples": 214}, {"name": "test", "num_bytes": 239208412.0, "num_examples": 76}], "download_size": 477754123, "dataset_size": 912643550.0}], "configs": [{"config_name": "liftout", "data_files": [{"split": "train", "path": "liftout/train-*"}, {"split": "test", "path": "liftout/test-*"}]}, {"config_name": "serial-liftout", "data_files": [{"split": "train", "path": "serial-liftout/train-*"}, {"split": "test", "path": "serial-liftout/test-*"}]}, {"config_name": "waffle", "data_files": [{"split": "train", "path": "waffle/train-*"}, {"split": "test", "path": "waffle/test-*"}]}]} | 2024-01-21T10:49:41+00:00 |
4f4ca8330f298715860d1dab30428d0d61c98a81 | AlexDom/TSA | [
"region:us"
] | 2024-01-14T04:09:37+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "inputs", "struct": [{"name": "text", "dtype": "string"}]}, {"name": "prediction", "list": [{"name": "label", "dtype": "string"}, {"name": "score", "dtype": "float64"}]}, {"name": "prediction_agent", "dtype": "string"}, {"name": "annotation", "dtype": "null"}, {"name": "annotation_agent", "dtype": "null"}, {"name": "multi_label", "dtype": "bool"}, {"name": "explanation", "dtype": "null"}, {"name": "id", "dtype": "null"}, {"name": "metadata", "struct": [{"name": "category", "dtype": "int64"}]}, {"name": "status", "dtype": "string"}, {"name": "event_timestamp", "dtype": "null"}, {"name": "metrics", "dtype": "null"}], "splits": [{"name": "train", "num_bytes": 1205760, "num_examples": 5001}], "download_size": 447577, "dataset_size": 1205760}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T17:59:22+00:00 |
|
d68526faf7251d13cad8f09216d1ff278528abba | Hunterlige/code_civil | [
"multilinguality:monolingual",
"source_datasets:original",
"language:fr",
"license:mit",
"region:us"
] | 2024-01-14T04:22:59+00:00 | {"language": ["fr"], "license": "mit", "multilinguality": ["monolingual"], "source_datasets": ["original"], "pretty_name": "Code civil"} | 2024-01-14T04:22:59+00:00 |
|
ef8e2a527e2f3cd53a649c98858122bae2f428ed |
# Dataset of cx4_storm/Cx4ストーム/Cx4风暴 (Girls' Frontline)
This is the dataset of cx4_storm/Cx4ストーム/Cx4风暴 (Girls' Frontline), containing 15 images and their tags.
The core tags of this character are `black_hair, long_hair, bow, breasts, red_eyes, hair_bow, red_bow, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 15.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cx4_storm_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 9.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cx4_storm_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 33 | 18.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cx4_storm_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 13.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cx4_storm_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 33 | 24.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cx4_storm_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cx4_storm_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
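The loaded source can also be filtered before use, for instance to keep only images carrying a particular tag. A minimal sketch, assuming `item.meta['tags']` supports a membership test on tag names; the tag chosen here is illustrative:
```python
# keep only items tagged 'solo' (tag name is illustrative)
solo_items = [
    item for item in LocalSource(dataset_dir)
    if 'solo' in item.meta['tags']
]
print(f"kept {len(solo_items)} of the loaded images")
```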
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, navel, open_mouth, simple_background, black_panties, black_thighhighs, garter_straps, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | navel | open_mouth | simple_background | black_panties | black_thighhighs | garter_straps | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:--------|:-------------|:--------------------|:----------------|:-------------------|:----------------|:-------------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/cx4_storm_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T04:23:58+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T04:26:37+00:00 |
ee2c899c5eae52ae85cb929ad5a08c407956c584 |
# Dataset Card for Evaluation run of Weyaxi/Bagel-Hermes-2x34B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Bagel-Hermes-2x34B](https://huggingface.co/Weyaxi/Bagel-Hermes-2x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34B",
"harness_winogrande_5",
split="train")
```
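The per-run splits can also be addressed directly. A minimal sketch, using the `latest` split alias and one of the `harness_<task>_<n_shot>` config names from this dataset's configuration list:
```python
from datasets import load_dataset

# load the most recent per-sample details for the 5-shot GSM8K task
details = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34B",
    "harness_gsm8k_5",
    split="latest",
)
print(details)
```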
## Latest results
These are the [latest results from run 2024-01-14T04:24:57.713282](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34B/blob/main/results_2024-01-14T04-24-57.713282.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7687937231792787,
"acc_stderr": 0.027887592122908762,
"acc_norm": 0.7725082714288936,
"acc_norm_stderr": 0.028420468097469523,
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961574,
"mc2": 0.6482085164957936,
"mc2_stderr": 0.01484519519589757
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.013688147309729119,
"acc_norm": 0.6979522184300341,
"acc_norm_stderr": 0.013417519144716417
},
"harness|hellaswag|10": {
"acc": 0.6595299741087433,
"acc_stderr": 0.004728988167338544,
"acc_norm": 0.8526190001991635,
"acc_norm_stderr": 0.0035376085010691773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066652,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066652
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9013157894736842,
"acc_stderr": 0.02427022773752271,
"acc_norm": 0.9013157894736842,
"acc_norm_stderr": 0.02427022773752271
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8037735849056604,
"acc_stderr": 0.024442388131100806,
"acc_norm": 0.8037735849056604,
"acc_norm_stderr": 0.024442388131100806
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.02635515841334941,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.02635515841334941
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7793103448275862,
"acc_stderr": 0.03455930201924813,
"acc_norm": 0.7793103448275862,
"acc_norm_stderr": 0.03455930201924813
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02351729433596328,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02351729433596328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.04415438226743745,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.04415438226743745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6403940886699507,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.6403940886699507,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.025485498373343237,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.025485498373343237
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9393939393939394,
"acc_stderr": 0.01699999492742161,
"acc_norm": 0.9393939393939394,
"acc_norm_stderr": 0.01699999492742161
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8179487179487179,
"acc_stderr": 0.019565236782930887,
"acc_norm": 0.8179487179487179,
"acc_norm_stderr": 0.019565236782930887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4185185185185185,
"acc_stderr": 0.030078013075022055,
"acc_norm": 0.4185185185185185,
"acc_norm_stderr": 0.030078013075022055
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8613445378151261,
"acc_stderr": 0.02244826447683259,
"acc_norm": 0.8613445378151261,
"acc_norm_stderr": 0.02244826447683259
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9100917431192661,
"acc_stderr": 0.012264304540230446,
"acc_norm": 0.9100917431192661,
"acc_norm_stderr": 0.012264304540230446
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6898148148148148,
"acc_stderr": 0.03154696285656629,
"acc_norm": 0.6898148148148148,
"acc_norm_stderr": 0.03154696285656629
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.02034340073486884,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.02034340073486884
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280226,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280226
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342337,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342337
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540627,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540627
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.02684576505455385,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.02684576505455385
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6071428571428571,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.6071428571428571,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.02931596291881348,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.02931596291881348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.01700436856813234,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.01700436856813234
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.909323116219668,
"acc_stderr": 0.010268429662528547,
"acc_norm": 0.909323116219668,
"acc_norm_stderr": 0.010268429662528547
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8265895953757225,
"acc_stderr": 0.020383229551135033,
"acc_norm": 0.8265895953757225,
"acc_norm_stderr": 0.020383229551135033
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7899441340782123,
"acc_stderr": 0.013623755371333533,
"acc_norm": 0.7899441340782123,
"acc_norm_stderr": 0.013623755371333533
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.01970403918385981,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.01970403918385981
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8327974276527331,
"acc_stderr": 0.021193872528034962,
"acc_norm": 0.8327974276527331,
"acc_norm_stderr": 0.021193872528034962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8919753086419753,
"acc_stderr": 0.01727176308448352,
"acc_norm": 0.8919753086419753,
"acc_norm_stderr": 0.01727176308448352
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.028267657482650158,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.028267657482650158
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6069100391134289,
"acc_stderr": 0.012474899613873955,
"acc_norm": 0.6069100391134289,
"acc_norm_stderr": 0.012474899613873955
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8455882352941176,
"acc_stderr": 0.021950024722922033,
"acc_norm": 0.8455882352941176,
"acc_norm_stderr": 0.021950024722922033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8489795918367347,
"acc_stderr": 0.022923004094736847,
"acc_norm": 0.8489795918367347,
"acc_norm_stderr": 0.022923004094736847
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659393,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659393
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.02464806896136616,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.02464806896136616
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961574,
"mc2": 0.6482085164957936,
"mc2_stderr": 0.01484519519589757
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065609
},
"harness|gsm8k|5": {
"acc": 0.6868840030326004,
"acc_stderr": 0.012774285669385096
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34b | [
"region:us"
] | 2024-01-14T04:27:11+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Bagel-Hermes-2x34B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Bagel-Hermes-2x34B](https://huggingface.co/Weyaxi/Bagel-Hermes-2x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T04:24:57.713282](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34B/blob/main/results_2024-01-14T04-24-57.713282.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7687937231792787,\n \"acc_stderr\": 0.027887592122908762,\n \"acc_norm\": 0.7725082714288936,\n \"acc_norm_stderr\": 0.028420468097469523,\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961574,\n \"mc2\": 0.6482085164957936,\n \"mc2_stderr\": 0.01484519519589757\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.013688147309729119,\n \"acc_norm\": 0.6979522184300341,\n \"acc_norm_stderr\": 0.013417519144716417\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6595299741087433,\n \"acc_stderr\": 0.004728988167338544,\n \"acc_norm\": 0.8526190001991635,\n \"acc_norm_stderr\": 0.0035376085010691773\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066652,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066652\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.02427022773752271,\n \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.02427022773752271\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8037735849056604,\n \"acc_stderr\": 0.024442388131100806,\n \"acc_norm\": 0.8037735849056604,\n \"acc_norm_stderr\": 0.024442388131100806\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.02635515841334941,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.02635515841334941\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7793103448275862,\n \"acc_stderr\": 0.03455930201924813,\n \"acc_norm\": 0.7793103448275862,\n \"acc_norm_stderr\": 0.03455930201924813\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02351729433596328,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02351729433596328\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6403940886699507,\n \"acc_stderr\": 0.03376458246509567,\n \"acc_norm\": 0.6403940886699507,\n \"acc_norm_stderr\": 0.03376458246509567\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.025485498373343237,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.025485498373343237\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9393939393939394,\n \"acc_stderr\": 0.01699999492742161,\n \"acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.01699999492742161\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.8179487179487179,\n \"acc_stderr\": 0.019565236782930887,\n \"acc_norm\": 0.8179487179487179,\n \"acc_norm_stderr\": 0.019565236782930887\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4185185185185185,\n \"acc_stderr\": 0.030078013075022055,\n \"acc_norm\": 0.4185185185185185,\n \"acc_norm_stderr\": 0.030078013075022055\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8613445378151261,\n \"acc_stderr\": 0.02244826447683259,\n \"acc_norm\": 0.8613445378151261,\n \"acc_norm_stderr\": 0.02244826447683259\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9100917431192661,\n \"acc_stderr\": 0.012264304540230446,\n \"acc_norm\": 0.9100917431192661,\n \"acc_norm_stderr\": 0.012264304540230446\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6898148148148148,\n \"acc_stderr\": 0.03154696285656629,\n \"acc_norm\": 0.6898148148148148,\n \"acc_norm_stderr\": 0.03154696285656629\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.890295358649789,\n \"acc_stderr\": 0.02034340073486884,\n \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.02034340073486884\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.02693611191280226,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.02693611191280226\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342337,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342337\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540627,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540627\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.02684576505455385,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.02684576505455385\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6071428571428571,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.6071428571428571,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.02931596291881348,\n \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.02931596291881348\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.01700436856813234,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.01700436856813234\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.909323116219668,\n \"acc_stderr\": 0.010268429662528547,\n 
\"acc_norm\": 0.909323116219668,\n \"acc_norm_stderr\": 0.010268429662528547\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.020383229551135033,\n \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.020383229551135033\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7899441340782123,\n \"acc_stderr\": 0.013623755371333533,\n \"acc_norm\": 0.7899441340782123,\n \"acc_norm_stderr\": 0.013623755371333533\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.01970403918385981,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.01970403918385981\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8327974276527331,\n \"acc_stderr\": 0.021193872528034962,\n \"acc_norm\": 0.8327974276527331,\n \"acc_norm_stderr\": 0.021193872528034962\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8919753086419753,\n \"acc_stderr\": 0.01727176308448352,\n \"acc_norm\": 0.8919753086419753,\n \"acc_norm_stderr\": 0.01727176308448352\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.028267657482650158,\n \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.028267657482650158\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6069100391134289,\n \"acc_stderr\": 0.012474899613873955,\n \"acc_norm\": 0.6069100391134289,\n \"acc_norm_stderr\": 0.012474899613873955\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8455882352941176,\n \"acc_stderr\": 0.021950024722922033,\n \"acc_norm\": 0.8455882352941176,\n \"acc_norm_stderr\": 0.021950024722922033\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.022923004094736847,\n \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.022923004094736847\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.02464806896136616,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.02464806896136616\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961574,\n \"mc2\": 0.6482085164957936,\n \"mc2_stderr\": 0.01484519519589757\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065609\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6868840030326004,\n \"acc_stderr\": 0.012774285669385096\n }\n}\n```", "repo_url": 
"https://huggingface.co/Weyaxi/Bagel-Hermes-2x34B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|arc:challenge|25_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|gsm8k|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hellaswag|10_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-24-57.713282.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-24-57.713282.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-24-57.713282.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-24-57.713282.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-24-57.713282.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|winogrande|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["results_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T04-24-57.713282.parquet"]}]}]} | 2024-01-25T08:33:43+00:00 |
5396224ec3ecc97f401f5530b217b70d9f5910f8 | photonmz/blackjack-gpt | [
"region:us"
] | 2024-01-14T04:40:05+00:00 | {"dataset_info": {"features": [{"name": "prompt", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "label", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 967206.6489043825, "num_examples": 1807}, {"name": "test", "num_bytes": 107586.35109561753, "num_examples": 201}], "download_size": 21367, "dataset_size": 1074793.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-14T05:03:06+00:00 |
|
78a9d79266e62e1fa16775b7d6d32b09b26be756 |
# Dataset of pzb39/PzB39/PzB39 (Girls' Frontline)
This is the dataset of pzb39/PzB39/PzB39 (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are `black_hair, breasts, long_hair, bangs, red_eyes, very_long_hair, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 12.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pzb39_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 7.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pzb39_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 19 | 12.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pzb39_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 10.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pzb39_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 19 | 18.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pzb39_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
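If you only need one of the pre-processed packages above (for example the 800px IMG+TXT bundle) instead of the raw archive, a minimal download sketch using the file names from the table is shown below; the target directory name is an arbitrary choice.
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/pzb39_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it to a local directory of your choice
target_dir = 'pzb39_800'
os.makedirs(target_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(target_dir)
```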
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/pzb39_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, gloves, holding, smile, simple_background, white_background, arm_tattoo, bare_shoulders, black_jacket, black_necktie, black_pants, closed_mouth, ground_vehicle, gun, headwear_removed, long_sleeves, motorcycle, red_choker, uniform |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | gloves | holding | smile | simple_background | white_background | arm_tattoo | bare_shoulders | black_jacket | black_necktie | black_pants | closed_mouth | ground_vehicle | gun | headwear_removed | long_sleeves | motorcycle | red_choker | uniform |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------|:----------|:--------|:--------------------|:-------------------|:-------------|:-----------------|:---------------|:----------------|:--------------|:---------------|:-----------------|:------|:-------------------|:---------------|:-------------|:-------------|:----------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
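Because each loaded item exposes its tags via `item.meta['tags']` (see the loader above), clusters like the one in this table can be re-filtered locally; the tag picked below is just an illustrative choice from the table, not a required value.
```python
from waifuc.source import LocalSource

# walk the extracted raw package and keep images carrying a tag of interest
source = LocalSource('dataset_dir')
for item in source:
    tags = item.meta['tags']  # tag collection attached to each image in the raw package
    if 'motorcycle' in tags:  # illustrative tag taken from the cluster table above
        print(item.meta['filename'], sorted(tags)[:10])
```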
| CyberHarem/pzb39_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T04:46:44+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T04:49:46+00:00 |
785109254fdc40dd7e1ee0687417568ee043fa61 |
# Dataset of gr_psg_1/GrPSG-1/PSG-1 (Girls' Frontline)
This is the dataset of gr_psg_1/GrPSG-1/PSG-1 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `breasts, ponytail, long_hair, hair_ornament, grey_eyes, white_hair, bangs, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 8.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_psg_1_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 6.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_psg_1_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 25 | 12.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_psg_1_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 8.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_psg_1_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 25 | 14.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_psg_1_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/gr_psg_1_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
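For the IMG+TXT packages, each image typically ships with a same-named `.txt` file of comma-separated tags, so no waifuc is needed to read them; a standard-library sketch (directory name assumed to match the extraction step above) is:
```python
from pathlib import Path

dataset_dir = Path('dataset_dir')  # directory extracted from one of the IMG+TXT packages
image_suffixes = {'.png', '.jpg', '.jpeg', '.webp'}

for image_path in sorted(p for p in dataset_dir.rglob('*') if p.suffix.lower() in image_suffixes):
    tag_path = image_path.with_suffix('.txt')  # tag file sits next to the image
    if not tag_path.exists():
        continue
    tags = [t.strip() for t in tag_path.read_text(encoding='utf-8').split(',') if t.strip()]
    print(image_path.name, tags[:5])
```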
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, jacket, sniper_rifle, bikini_top_only, full_body, black_pantyhose, closed_mouth, front-tie_top, black_bikini, black_footwear, cleavage, collarbone, navel, open_clothes, scope, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | jacket | sniper_rifle | bikini_top_only | full_body | black_pantyhose | closed_mouth | front-tie_top | black_bikini | black_footwear | cleavage | collarbone | navel | open_clothes | scope | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------|:---------------|:------------------|:------------|:------------------|:---------------|:----------------|:---------------|:-----------------|:-----------|:-------------|:--------|:---------------|:--------|:--------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/gr_psg_1_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T04:46:46+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T04:49:19+00:00 |
8a7be6a2da9fd5e94aecf35398ef269f7ccf891e |
# Dataset Card for Evaluation run of harborwater/dpo-test-hermes-open-llama-3b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [harborwater/dpo-test-hermes-open-llama-3b](https://huggingface.co/harborwater/dpo-test-hermes-open-llama-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b",
"harness_winogrande_5",
split="train")
```
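The same call works for any configuration listed in this card's metadata; for example, reading the aggregated metrics from the most recent run uses the "results" configuration and the "latest" split described above (a sketch):
```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" always points to the newest run
results = load_dataset(
    "open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b",
    "results",
    split="latest",
)
print(results[0])
```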
## Latest results
These are the [latest results from run 2024-01-14T04:56:07.071188](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b/blob/main/results_2024-01-14T04-56-07.071188.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2514093021467422,
"acc_stderr": 0.03052650097964464,
"acc_norm": 0.25202173312622367,
"acc_norm_stderr": 0.03127688845727799,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520672,
"mc2": 0.3980562710501165,
"mc2_stderr": 0.014269053798319005
},
"harness|arc:challenge|25": {
"acc": 0.36689419795221845,
"acc_stderr": 0.014084133118104292,
"acc_norm": 0.3924914675767918,
"acc_norm_stderr": 0.014269634635670712
},
"harness|hellaswag|10": {
"acc": 0.5091615216092412,
"acc_stderr": 0.004988943721711217,
"acc_norm": 0.6745668193586934,
"acc_norm_stderr": 0.004675789156977649
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.03547854198560824,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.03547854198560824
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123415,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123415
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.02575755989310675,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.02575755989310675
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641145,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641145
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.02937917046412482,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.02937917046412482
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.020940481565334866,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.020940481565334866
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471276,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471276
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18387096774193548,
"acc_stderr": 0.022037217340267833,
"acc_norm": 0.18387096774193548,
"acc_norm_stderr": 0.022037217340267833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18719211822660098,
"acc_stderr": 0.027444924966882618,
"acc_norm": 0.18719211822660098,
"acc_norm_stderr": 0.027444924966882618
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206824,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206824
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.027479603010538783,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.027479603010538783
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2282051282051282,
"acc_stderr": 0.02127839386358628,
"acc_norm": 0.2282051282051282,
"acc_norm_stderr": 0.02127839386358628
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21467889908256882,
"acc_stderr": 0.017604304149256494,
"acc_norm": 0.21467889908256882,
"acc_norm_stderr": 0.017604304149256494
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.02649191472735516,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.02649191472735516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.015594955384455766,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.015594955384455766
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.02335736578587404,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.02335736578587404
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22905027932960895,
"acc_stderr": 0.01405431493561456,
"acc_norm": 0.22905027932960895,
"acc_norm_stderr": 0.01405431493561456
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.023152722439402303,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.023152722439402303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24382716049382716,
"acc_stderr": 0.023891879541959614,
"acc_norm": 0.24382716049382716,
"acc_norm_stderr": 0.023891879541959614
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.02512373922687241,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.02512373922687241
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.01100597139992724,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.01100597139992724
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20955882352941177,
"acc_stderr": 0.02472311040767705,
"acc_norm": 0.20955882352941177,
"acc_norm_stderr": 0.02472311040767705
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528037,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528037
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2,
"acc_stderr": 0.025607375986579153,
"acc_norm": 0.2,
"acc_norm_stderr": 0.025607375986579153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.03546976959393163,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.03546976959393163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520672,
"mc2": 0.3980562710501165,
"mc2_stderr": 0.014269053798319005
},
"harness|winogrande|5": {
"acc": 0.6440410418310971,
"acc_stderr": 0.01345674065627396
},
"harness|gsm8k|5": {
"acc": 0.013646702047005308,
"acc_stderr": 0.003195747075480815
}
}
```
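For a quick sanity check on the figures above, the linked JSON can be downloaded and averaged directly; this sketch assumes the per-task entries shown above sit under a `results` key in the file (falling back to the top level otherwise):
```python
import json

from huggingface_hub import hf_hub_download

# fetch the raw results file linked above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b",
    repo_type="dataset",
    filename="results_2024-01-14T04-56-07.071188.json",
)
with open(path, encoding="utf-8") as f:
    data = json.load(f)

# the snippet above is the per-task section; fall back to the top level if the file is flat
results = data.get("results", data)

# macro-average acc_norm over the MMLU (hendrycksTest) subtasks
mmlu = [v["acc_norm"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```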
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b | [
"region:us"
] | 2024-01-14T04:57:52+00:00 | {"pretty_name": "Evaluation run of harborwater/dpo-test-hermes-open-llama-3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [harborwater/dpo-test-hermes-open-llama-3b](https://huggingface.co/harborwater/dpo-test-hermes-open-llama-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T04:56:07.071188](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b/blob/main/results_2024-01-14T04-56-07.071188.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2514093021467422,\n \"acc_stderr\": 0.03052650097964464,\n \"acc_norm\": 0.25202173312622367,\n \"acc_norm_stderr\": 0.03127688845727799,\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.015127427096520672,\n \"mc2\": 0.3980562710501165,\n \"mc2_stderr\": 0.014269053798319005\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.36689419795221845,\n \"acc_stderr\": 0.014084133118104292,\n \"acc_norm\": 0.3924914675767918,\n \"acc_norm_stderr\": 0.014269634635670712\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5091615216092412,\n \"acc_stderr\": 0.004988943721711217,\n \"acc_norm\": 0.6745668193586934,\n \"acc_norm_stderr\": 0.004675789156977649\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.21481481481481482,\n \"acc_stderr\": 0.03547854198560824,\n \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.03547854198560824\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123415,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123415\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.02575755989310675,\n \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.02575755989310675\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.03186209851641145,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.03186209851641145\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.02937917046412482,\n \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.02937917046412482\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.020940481565334866,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.020940481565334866\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.040735243221471276,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.040735243221471276\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18387096774193548,\n \"acc_stderr\": 0.022037217340267833,\n \"acc_norm\": 0.18387096774193548,\n \"acc_norm_stderr\": 0.022037217340267833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.18719211822660098,\n \"acc_stderr\": 0.027444924966882618,\n \"acc_norm\": 0.18719211822660098,\n \"acc_norm_stderr\": 0.027444924966882618\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206824,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206824\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.027479603010538783,\n \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.027479603010538783\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817234,\n \"acc_norm\": 
0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817234\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2282051282051282,\n \"acc_stderr\": 0.02127839386358628,\n \"acc_norm\": 0.2282051282051282,\n \"acc_norm_stderr\": 0.02127839386358628\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21467889908256882,\n \"acc_stderr\": 0.017604304149256494,\n \"acc_norm\": 0.21467889908256882,\n \"acc_norm_stderr\": 0.017604304149256494\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.02649191472735516,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.02649191472735516\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.02904133351059804,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.02904133351059804\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.03322015795776741,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.03322015795776741\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n \"acc_stderr\": 0.015594955384455766,\n \"acc_norm\": 0.2554278416347382,\n \"acc_norm_stderr\": 0.015594955384455766\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.02335736578587404,\n \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.02335736578587404\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22905027932960895,\n \"acc_stderr\": 0.01405431493561456,\n \"acc_norm\": 0.22905027932960895,\n \"acc_norm_stderr\": 0.01405431493561456\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.023152722439402303,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.023152722439402303\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.24382716049382716,\n \"acc_stderr\": 0.023891879541959614,\n \"acc_norm\": 0.24382716049382716,\n \"acc_norm_stderr\": 0.023891879541959614\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23049645390070922,\n \"acc_stderr\": 0.02512373922687241,\n \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.02512373922687241\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n \"acc_stderr\": 0.01100597139992724,\n \"acc_norm\": 0.24641460234680573,\n \"acc_norm_stderr\": 0.01100597139992724\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20955882352941177,\n \"acc_stderr\": 0.02472311040767705,\n \"acc_norm\": 0.20955882352941177,\n \"acc_norm_stderr\": 0.02472311040767705\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528037,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528037\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.025607375986579153,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.025607375986579153\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.03546976959393163,\n \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.03546976959393163\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.015127427096520672,\n \"mc2\": 0.3980562710501165,\n \"mc2_stderr\": 0.014269053798319005\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6440410418310971,\n \"acc_stderr\": 0.01345674065627396\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.013646702047005308,\n \"acc_stderr\": 0.003195747075480815\n }\n}\n```", "repo_url": "https://huggingface.co/harborwater/dpo-test-hermes-open-llama-3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|arc:challenge|25_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|gsm8k|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hellaswag|10_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-56-07.071188.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-56-07.071188.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-56-07.071188.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-56-07.071188.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|winogrande|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T04_56_07.071188", "path": ["results_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T04-56-07.071188.parquet"]}]}]} | 2024-01-14T04:58:12+00:00 |
7cf8aa676fba15e333eb7ba1d016e2605dfc737b |
# Singlish to English 🇸🇬
> Singapore is known for its efficiency and Singlish is no different - it's colourful and snappy. - [Tessa Wong, BBC News, 2015](https://www.bbc.com/news/magazine-33809914)
This is a synthetic dataset generated by GPT-4.
Each JSON pair contains one Singlish sentence about an everyday activity (e.g. cooking) and its English translation.
# Sample entry
```json
{
  "singlish": "Eh, chop the garlic - you can a not?",
  "english": "Hey, do you know how to chop the garlic?"
}
```
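# Loading the dataset
The sketch below shows one way to pull the translation pairs for downstream use. It assumes the CSV produced by the generation script further down (`singlish_to_english_v0.1.csv`) is stored at the top level of this repository under that exact name; adjust the filename if the repo layout differs.
```python
import pandas as pd
from huggingface_hub import hf_hub_download

# Assumption: the generated CSV was uploaded to this dataset repo under its original name
csv_path = hf_hub_download(
    repo_id="cyzgab/singlish-to-english-synthetic",
    repo_type="dataset",
    filename="singlish_to_english_v0.1.csv",
)

pairs = pd.read_csv(csv_path)
print(pairs[["singlish", "english"]].head())
```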
# Data Generation Code
```python
import json
import pandas as pd
from openai import OpenAI
client = OpenAI()
NUM_SAMPLE = 10
ACTIVITIES = ['cooking',
'studying',
'sleeping',
'eating',
'working',
'exercising',
'reading',
'cleaning',
'shopping',
'driving',
'walking',
'bathing',
'going to work',
'listening to music',
'watching TV',
'playing video games',
'using a computer',
'texting',
'socializing',
'meditating',
'commuting',
'doing laundry',
'ironing clothes',
'dusting',
'vacuuming',
'painting',
'drawing',
'grocery shopping',
'sewing',
'taking a nap',
'jogging',
'biking',
'swimming',
'playing sports',
'checking emails',
'playing with children',
'watching movies',
'playing board games',
'attending school or classes',
'going to the gym',
'playing a musical instrument',
'singing',
'dancing',
'writing',
'photography',
'traveling',
'visiting friends',
'attending events',
'volunteering',
'attending meetings']
dataset = {}
for index, activity in enumerate(ACTIVITIES):
print(index, activity)
response = client.chat.completions.create(
model="gpt-4-1106-preview",
messages=[{"role": "system",
"content": "You are an expert in translating Singlish to English"},
{"role": "user",
"content": f"Create {NUM_SAMPLE} random Singlish (s) to English (e) translation pairs in json. Write full sentences about {activity}."\
f"Don't exaggerate the use of Singlish, and be natural, as how a real Singaporean would speak."\
f"Start the keys from {(index*NUM_SAMPLE)+1}. For example,"\
"{'X':{'s': 'aiyo, why like that', 'e': 'oh my, how did this happen'}"\
"..., 'X+5': {'s': 'don't play play', 'e': 'don't fool around'} }"}],
temperature=0.01,
response_format={"type":"json_object"}
)
output = response.choices[0].message.content
output_json = json.loads(output)
dataset.update(output_json)
# Save the current state of the combined dictionary
with open('singlish_to_english_v0.1.json', 'w') as f:
json.dump(dataset, f, indent=None)
# Convert to tabular csv
df = pd.read_json("singlish_to_english_v0.1.json")
df = df.T
df = df.reset_index()
df.columns = ["index", "singlish", "english"]
df.to_csv("singlish_to_english_v0.1.csv", index=False)
``` | cyzgab/singlish-to-english-synthetic | [
"task_categories:translation",
"size_categories:n<1K",
"language:en",
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2024-01-14T05:17:35+00:00 | {"language": ["en"], "license": "cc-by-nc-sa-4.0", "size_categories": ["n<1K"], "task_categories": ["translation"], "pretty_name": "Singlish to English \ud83c\uddf8\ud83c\uddec"} | 2024-01-14T07:44:18+00:00 |
2d0a331ff2bc9cad7c76f19338226757dc197e06 |
# Dataset of a_545/A-545/A-545 (Girls' Frontline)
This is the dataset of a_545/A-545/A-545 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are `long_hair, bangs, breasts, blonde_hair, braid, twintails, medium_breasts, hat, blue_eyes, aqua_eyes, beret, black_headwear, very_long_hair, braided_bangs, hair_ornament, hairclip`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 36.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 17.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 48 | 34.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 29.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 48 | 54.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/a_545_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, looking_at_viewer, solo, simple_background, white_background, assault_rifle, black_footwear, bodysuit, black_gloves, closed_mouth, smile, black_thighhighs, dress, alcohol, holding_bottle, full_body, high_heel_boots, holding_gun, sitting, thigh_boots |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | simple_background | white_background | assault_rifle | black_footwear | bodysuit | black_gloves | closed_mouth | smile | black_thighhighs | dress | alcohol | holding_bottle | full_body | high_heel_boots | holding_gun | sitting | thigh_boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------------|:-------------------|:----------------|:-----------------|:-----------|:---------------|:---------------|:--------|:-------------------|:--------|:----------|:-----------------|:------------|:------------------|:--------------|:----------|:--------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/a_545_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T05:23:34+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T05:27:55+00:00 |
52a7522832e6197b78444c871feea1dc69eb8da5 | andersonbcdefg/msmarco_triples_trunc | [
"region:us"
] | 2024-01-14T05:32:16+00:00 | {"dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "pos", "dtype": "string"}, {"name": "neg", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 370164820, "num_examples": 499184}], "download_size": 240326730, "dataset_size": 370164820}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T05:36:11+00:00 |
|
b70c9eff184e2e2759b53c4c88263573369985a2 |
# Dataset of 6p62/6P62/6P62 (Girls' Frontline)
This is the dataset of 6p62/6P62/6P62 (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are `long_hair, red_hair, blue_eyes, hat, breasts, large_breasts, bangs, glasses, ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 14.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/6p62_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 9.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/6p62_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 24 | 16.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/6p62_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 13.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/6p62_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 24 | 22.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/6p62_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/6p62_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, pantyhose, looking_at_viewer, gun, long_sleeves, shirt, smile, thighhighs, boots, full_body, jacket, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | pantyhose | looking_at_viewer | gun | long_sleeves | shirt | smile | thighhighs | boots | full_body | jacket | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:------------|:--------------------|:------|:---------------|:--------|:--------|:-------------|:--------|:------------|:---------|:-------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/6p62_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T05:44:56+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T05:48:16+00:00 |
5a23e0096725e77775d84294f32ca3a2dcbc5da5 |
# Dataset Card for Evaluation run of nisten/shqiponja-59b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nisten/shqiponja-59b-v1](https://huggingface.co/nisten/shqiponja-59b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nisten__shqiponja-59b-v1",
"harness_winogrande_5",
split="train")
```
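The aggregated metrics are exposed in the same way through the "results" configuration. A minimal sketch, assuming this card follows the usual evaluation-run layout in which that configuration also defines a "latest" split:
```python
from datasets import load_dataset

# Aggregated results of the run; "latest" points at the most recent evaluation
results = load_dataset("open-llm-leaderboard/details_nisten__shqiponja-59b-v1",
	"results",
	split="latest")
```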
## Latest results
These are the [latest results from run 2024-01-14T05:56:39.495831](https://huggingface.co/datasets/open-llm-leaderboard/details_nisten__shqiponja-59b-v1/blob/main/results_2024-01-14T05-56-39.495831.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7432378535506855,
"acc_stderr": 0.02859899074099913,
"acc_norm": 0.7559556232571321,
"acc_norm_stderr": 0.029186017628606568,
"mc1": 0.5373317013463892,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.7043324455434049,
"mc2_stderr": 0.014572093049489886
},
"harness|arc:challenge|25": {
"acc": 0.6757679180887372,
"acc_stderr": 0.01367881039951882,
"acc_norm": 0.7005119453924915,
"acc_norm_stderr": 0.01338502163731357
},
"harness|hellaswag|10": {
"acc": 0.6440948018323043,
"acc_stderr": 0.004778081784542404,
"acc_norm": 0.8405696076478789,
"acc_norm_stderr": 0.0036532880435558015
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.725925925925926,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.725925925925926,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.028631951845930387,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.028631951845930387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8819444444444444,
"acc_stderr": 0.026983346503309382,
"acc_norm": 0.8819444444444444,
"acc_norm_stderr": 0.026983346503309382
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.04971358884367406,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.04971358884367406
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7659574468085106,
"acc_stderr": 0.027678452578212394,
"acc_norm": 0.7659574468085106,
"acc_norm_stderr": 0.027678452578212394
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7034482758620689,
"acc_stderr": 0.03806142687309992,
"acc_norm": 0.7034482758620689,
"acc_norm_stderr": 0.03806142687309992
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5846560846560847,
"acc_stderr": 0.0253795249107784,
"acc_norm": 0.5846560846560847,
"acc_norm_stderr": 0.0253795249107784
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6031746031746031,
"acc_stderr": 0.0437588849272706,
"acc_norm": 0.6031746031746031,
"acc_norm_stderr": 0.0437588849272706
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9,
"acc_stderr": 0.01706640371965727,
"acc_norm": 0.9,
"acc_norm_stderr": 0.01706640371965727
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9242424242424242,
"acc_stderr": 0.018852670234993093,
"acc_norm": 0.9242424242424242,
"acc_norm_stderr": 0.018852670234993093
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.014385432857476444,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.014385432857476444
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.0199823472086373,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.0199823472086373
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476668,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8361344537815126,
"acc_stderr": 0.02404405494044049,
"acc_norm": 0.8361344537815126,
"acc_norm_stderr": 0.02404405494044049
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.040802441856289715,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.040802441856289715
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.01180036136301657,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.01180036136301657
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.03191923445686185,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.03191923445686185
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.0277901770643836,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.0277901770643836
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9007633587786259,
"acc_stderr": 0.02622223517147737,
"acc_norm": 0.9007633587786259,
"acc_norm_stderr": 0.02622223517147737
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622793,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622793
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553838,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553838
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.047184714852195865,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.047184714852195865
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9487179487179487,
"acc_stderr": 0.014450181176872726,
"acc_norm": 0.9487179487179487,
"acc_norm_stderr": 0.014450181176872726
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8991060025542784,
"acc_stderr": 0.01077047201488671,
"acc_norm": 0.8991060025542784,
"acc_norm_stderr": 0.01077047201488671
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.021152676966575266,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.021152676966575266
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7854748603351955,
"acc_stderr": 0.013728923407828853,
"acc_norm": 0.7854748603351955,
"acc_norm_stderr": 0.013728923407828853
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.826797385620915,
"acc_stderr": 0.0216684002565143,
"acc_norm": 0.826797385620915,
"acc_norm_stderr": 0.0216684002565143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8038585209003215,
"acc_stderr": 0.022552447780478026,
"acc_norm": 0.8038585209003215,
"acc_norm_stderr": 0.022552447780478026
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.845679012345679,
"acc_stderr": 0.020100830999850994,
"acc_norm": 0.845679012345679,
"acc_norm_stderr": 0.020100830999850994
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5788787483702738,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.5788787483702738,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8014705882352942,
"acc_stderr": 0.024231013370541093,
"acc_norm": 0.8014705882352942,
"acc_norm_stderr": 0.024231013370541093
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.016011237996336938,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.016011237996336938
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.0250002560395462,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.0250002560395462
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5373317013463892,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.7043324455434049,
"mc2_stderr": 0.014572093049489886
},
"harness|winogrande|5": {
"acc": 0.8026835043409629,
"acc_stderr": 0.011185026389050366
},
"harness|gsm8k|5": {
"acc": 0.15466262319939347,
"acc_stderr": 0.009959786220917203
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_nisten__shqiponja-59b-v1 | [
"region:us"
] | 2024-01-14T05:58:53+00:00 | {"pretty_name": "Evaluation run of nisten/shqiponja-59b-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [nisten/shqiponja-59b-v1](https://huggingface.co/nisten/shqiponja-59b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nisten__shqiponja-59b-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T05:56:39.495831](https://huggingface.co/datasets/open-llm-leaderboard/details_nisten__shqiponja-59b-v1/blob/main/results_2024-01-14T05-56-39.495831.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7432378535506855,\n \"acc_stderr\": 0.02859899074099913,\n \"acc_norm\": 0.7559556232571321,\n \"acc_norm_stderr\": 0.029186017628606568,\n \"mc1\": 0.5373317013463892,\n \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.7043324455434049,\n \"mc2_stderr\": 0.014572093049489886\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6757679180887372,\n \"acc_stderr\": 0.01367881039951882,\n \"acc_norm\": 0.7005119453924915,\n \"acc_norm_stderr\": 0.01338502163731357\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6440948018323043,\n \"acc_stderr\": 0.004778081784542404,\n \"acc_norm\": 0.8405696076478789,\n \"acc_norm_stderr\": 0.0036532880435558015\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.725925925925926,\n \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.725925925925926,\n \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.028631951845930387,\n \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.028631951845930387\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8819444444444444,\n \"acc_stderr\": 0.026983346503309382,\n \"acc_norm\": 0.8819444444444444,\n \"acc_norm_stderr\": 0.026983346503309382\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n 
\"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367406,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367406\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7659574468085106,\n \"acc_stderr\": 0.027678452578212394,\n \"acc_norm\": 0.7659574468085106,\n \"acc_norm_stderr\": 0.027678452578212394\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7034482758620689,\n \"acc_stderr\": 0.03806142687309992,\n \"acc_norm\": 0.7034482758620689,\n \"acc_norm_stderr\": 0.03806142687309992\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5846560846560847,\n \"acc_stderr\": 0.0253795249107784,\n \"acc_norm\": 0.5846560846560847,\n \"acc_norm_stderr\": 0.0253795249107784\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6031746031746031,\n \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.6031746031746031,\n \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.01706640371965727,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.01706640371965727\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03344283744280458,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03344283744280458\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9242424242424242,\n \"acc_stderr\": 0.018852670234993093,\n \"acc_norm\": 0.9242424242424242,\n \"acc_norm_stderr\": 0.018852670234993093\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.014385432857476444,\n \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.014385432857476444\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8076923076923077,\n \"acc_stderr\": 
0.0199823472086373,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.0199823472086373\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476668,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476668\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8361344537815126,\n \"acc_stderr\": 0.02404405494044049,\n \"acc_norm\": 0.8361344537815126,\n \"acc_norm_stderr\": 0.02404405494044049\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.48344370860927155,\n \"acc_stderr\": 0.040802441856289715,\n \"acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.040802441856289715\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9174311926605505,\n \"acc_stderr\": 0.01180036136301657,\n \"acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.01180036136301657\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.03191923445686185,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.03191923445686185\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n \"acc_stderr\": 0.0277901770643836,\n \"acc_norm\": 0.7802690582959642,\n \"acc_norm_stderr\": 0.0277901770643836\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9007633587786259,\n \"acc_stderr\": 0.02622223517147737,\n \"acc_norm\": 0.9007633587786259,\n \"acc_norm_stderr\": 0.02622223517147737\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622793,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622793\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553838,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553838\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n \"acc_stderr\": 0.047184714852195865,\n \"acc_norm\": 0.5535714285714286,\n \"acc_norm_stderr\": 0.047184714852195865\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9487179487179487,\n \"acc_stderr\": 0.014450181176872726,\n \"acc_norm\": 0.9487179487179487,\n \"acc_norm_stderr\": 0.014450181176872726\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8991060025542784,\n \"acc_stderr\": 0.01077047201488671,\n \"acc_norm\": 0.8991060025542784,\n 
\"acc_norm_stderr\": 0.01077047201488671\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.021152676966575266,\n \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.021152676966575266\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7854748603351955,\n \"acc_stderr\": 0.013728923407828853,\n \"acc_norm\": 0.7854748603351955,\n \"acc_norm_stderr\": 0.013728923407828853\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.0216684002565143,\n \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.0216684002565143\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8038585209003215,\n \"acc_stderr\": 0.022552447780478026,\n \"acc_norm\": 0.8038585209003215,\n \"acc_norm_stderr\": 0.022552447780478026\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.845679012345679,\n \"acc_stderr\": 0.020100830999850994,\n \"acc_norm\": 0.845679012345679,\n \"acc_norm_stderr\": 0.020100830999850994\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6170212765957447,\n \"acc_stderr\": 0.02899908090480618,\n \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.02899908090480618\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5788787483702738,\n \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.5788787483702738,\n \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8014705882352942,\n \"acc_stderr\": 0.024231013370541093,\n \"acc_norm\": 0.8014705882352942,\n \"acc_norm_stderr\": 0.024231013370541093\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.016011237996336938,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.016011237996336938\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.0250002560395462,\n \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.0250002560395462\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5373317013463892,\n \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.7043324455434049,\n \"mc2_stderr\": 0.014572093049489886\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8026835043409629,\n \"acc_stderr\": 0.011185026389050366\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15466262319939347,\n \"acc_stderr\": 0.009959786220917203\n }\n}\n```", "repo_url": "https://huggingface.co/nisten/shqiponja-59b-v1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|arc:challenge|25_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|gsm8k|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hellaswag|10_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T05-56-39.495831.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T05-56-39.495831.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T05-56-39.495831.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T05-56-39.495831.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T05-56-39.495831.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|winogrande|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["results_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T05-56-39.495831.parquet"]}]}]} | 2024-01-14T05:59:13+00:00 |
b15da7177f832c62959eb41930ffa4a466081880 | cristiansales/urafeminina | [
"license:openrail",
"region:us"
] | 2024-01-14T05:59:03+00:00 | {"license": "openrail"} | 2024-01-14T06:02:23+00:00 |
|
1f168f0b073b92a4af583cf865d304e50df2f4fa | Dataset collected from [PGB: A PubMed Graph Benchmark for Heterogeneous Network Representation Learning](https://arxiv.org/pdf/2305.02691.pdf)
Description:
inbound_citation (List): PMIDs of papers that cite this paper
outbound_citation (List): references of the paper (PMIDs)
PMID: PubMed ID | bisectgroup/PMID_CITED_forKG | [
"arxiv:2305.02691",
"region:us"
] | 2024-01-14T06:01:13+00:00 | {} | 2024-01-14T08:29:27+00:00 |
9c66126684e2fc5c55e6302db9c54764eed80c40 | lazishu/SFT-llm-with-gnn | [
"region:us"
] | 2024-01-14T06:06:52+00:00 | {} | 2024-01-14T06:06:52+00:00 |
|
3d929c37784978d18f9e1ec5242bcd12548fff0f | satpalsr/translation-filter | [
"region:us"
] | 2024-01-14T06:14:28+00:00 | {} | 2024-01-14T06:14:40+00:00 |
|
b917c17240b4e9e5bf9551b09f2ef29ba9c66b5a |
# Dataset Card for Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1",
"harness_winogrande_5",
split="train")
```
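Beyond the per-task details, the aggregated run-level metrics can be loaded the same way through the `results` configuration. This is a sketch, assuming this repository follows the same layout as the other leaderboard detail datasets (a `results` config whose `latest` split points to the most recent run); the exact fields inside the parquet are not guaranteed here.
```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1",
    "results",
    split="latest",
)

print(results.column_names)  # inspect what was written to the results parquet
print(results[0])            # first (and typically only) row of aggregated metrics
```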
## Latest results
These are the [latest results from run 2024-01-14T06:20:20.648218](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1/blob/main/results_2024-01-14T06-20-20.648218.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6191660640057981,
"acc_stderr": 0.03263652891344978,
"acc_norm": 0.6271945727055741,
"acc_norm_stderr": 0.03333445432068468,
"mc1": 0.43329253365973075,
"mc1_stderr": 0.017347024450107492,
"mc2": 0.5997212380160826,
"mc2_stderr": 0.015696061571327326
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131167,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.01383903976282017
},
"harness|hellaswag|10": {
"acc": 0.6580362477594105,
"acc_stderr": 0.004733980470799212,
"acc_norm": 0.8462457677753435,
"acc_norm_stderr": 0.0035997580435468044
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155243,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155243
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110932,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612907,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612907
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879716,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.016286674879101022,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.016286674879101022
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914389,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914389
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622868,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622868
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928007,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928007
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.01922832201869664,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.01922832201869664
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43329253365973075,
"mc1_stderr": 0.017347024450107492,
"mc2": 0.5997212380160826,
"mc2_stderr": 0.015696061571327326
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209408
},
"harness|gsm8k|5": {
"acc": 0.20318423047763456,
"acc_stderr": 0.011083227665267797
}
}
```
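In addition to the per-task detail configs, the aggregated numbers shown above are stored in a dedicated `results` config. A minimal loading sketch, assuming the `results` config and `latest` split declared in this card's configuration:

```python
from datasets import load_dataset

# aggregated scores of the latest evaluation run;
# the "results" config and "latest" split are taken from this card's config list
results = load_dataset(
    "open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1",
    "results",
    split="latest",
)
print(results[0])
```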
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1 | [
"region:us"
] | 2024-01-14T06:22:39+00:00 | {"pretty_name": "Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T06:20:20.648218](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1/blob/main/results_2024-01-14T06-20-20.648218.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6191660640057981,\n \"acc_stderr\": 0.03263652891344978,\n \"acc_norm\": 0.6271945727055741,\n \"acc_norm_stderr\": 0.03333445432068468,\n \"mc1\": 0.43329253365973075,\n \"mc1_stderr\": 0.017347024450107492,\n \"mc2\": 0.5997212380160826,\n \"mc2_stderr\": 0.015696061571327326\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131167,\n \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.01383903976282017\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6580362477594105,\n \"acc_stderr\": 0.004733980470799212,\n \"acc_norm\": 0.8462457677753435,\n \"acc_norm_stderr\": 0.0035997580435468044\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155243,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155243\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n \"acc_norm\": 0.8704663212435233,\n 
\"acc_norm_stderr\": 0.024233532297758723\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110932,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110932\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612907,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612907\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879716,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879716\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n \"acc_stderr\": 0.016286674879101022,\n \"acc_norm\": 0.3865921787709497,\n \"acc_norm_stderr\": 0.016286674879101022\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914389,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914389\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n \"acc_stderr\": 0.012654565234622868,\n \"acc_norm\": 0.43285528031290743,\n \"acc_norm_stderr\": 0.012654565234622868\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928007,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928007\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.01922832201869664,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.01922832201869664\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43329253365973075,\n \"mc1_stderr\": 0.017347024450107492,\n \"mc2\": 0.5997212380160826,\n \"mc2_stderr\": 0.015696061571327326\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209408\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.20318423047763456,\n \"acc_stderr\": 0.011083227665267797\n }\n}\n```", "repo_url": "https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-20-20.648218.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-20-20.648218.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-20-20.648218.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-20-20.648218.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|winogrande|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T06_20_20.648218", "path": ["results_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T06-20-20.648218.parquet"]}]}]} | 2024-01-14T06:22:59+00:00 |
56ad754beaf6c8076efc89d4aa8e1a119d8a0608 | AsphyXIA/baarat_hi_small | [
"license:mit",
"region:us"
] | 2024-01-14T06:25:31+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "idx", "dtype": "int64"}, {"name": "src", "dtype": "string"}, {"name": "tgt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1799065325.5, "num_examples": 5062853}], "download_size": 969865534, "dataset_size": 1799065325.5}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T06:32:44+00:00 |
|
bb5ffa562bac8e29787dd02e4ceb22ece8366ec3 | icewiny/blurred_image_coyo_1M | [
"license:mit",
"region:us"
] | 2024-01-14T06:28:41+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "blurred_img", "dtype": "image"}, {"name": "caption", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 956490063.0, "num_examples": 6000}], "download_size": 952136439, "dataset_size": 956490063.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T15:05:01+00:00 |
|
defc20a14952a9d5a4c8f97f4ed14aeb5cd01f17 |
# Dataset of hunter/ハンター/猎人 (Azur Lane)
This is the dataset of hunter/ハンター/猎人 (Azur Lane), containing 24 images and their tags.
The core tags of this character are `hat, long_hair, red_eyes, brown_hair, bangs, blonde_hair, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 26.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hunter_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 15.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hunter_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 48 | 31.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hunter_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 23.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hunter_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 48 | 43.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hunter_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hunter_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
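The fixed-resolution packages listed above (for example `dataset-800.zip`) are plain IMG+TXT archives rather than Waifuc-Raw data, so they can be fetched and unpacked with the same tooling and without waifuc. A minimal sketch, assuming only the download location from the table (the internal layout of the archive is not documented here):
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
# download the 800px IMG+TXT package instead of the raw archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/hunter_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract files to your directory
dataset_dir = 'dataset_800_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```
The other packages (`dataset-1200.zip`, the stage3 crops) can be fetched the same way by swapping the `filename` argument.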
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, solo, tricorne, gloves, scarf, navel, shorts, belt, gun, looking_at_viewer, midriff, thighhighs, boots, jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | tricorne | gloves | scarf | navel | shorts | belt | gun | looking_at_viewer | midriff | thighhighs | boots | jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:---------|:--------|:--------|:---------|:-------|:------|:--------------------|:----------|:-------------|:--------|:---------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/hunter_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T06:38:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T06:43:41+00:00 |
9f816323a3958315604cbeac9fb517753b359b39 |
# Dataset of bulldog/ブルドッグ/大斗犬 (Azur Lane)
This is the dataset of bulldog/ブルドッグ/大斗犬 (Azur Lane), containing 17 images and their tags.
The core tags of this character are `hair_ornament, short_hair, red_eyes, white_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 11.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bulldog_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 8.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bulldog_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 37 | 17.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bulldog_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 11.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bulldog_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 37 | 22.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bulldog_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/bulldog_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
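Once an IMG+TXT package is extracted, each image is presumably accompanied by a same-named `.txt` file holding its tags; that naming convention is an assumption here, so treat the following as an illustrative sketch only:
```python
from pathlib import Path
# directory an IMG+TXT package (e.g. dataset-800.zip) was extracted to
dataset_dir = Path('dataset_dir')
# pair every image with its sidecar tag file (assumed naming: same stem, .txt suffix)
for image_path in sorted(dataset_dir.rglob('*')):
    if image_path.suffix.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    tag_path = image_path.with_suffix('.txt')
    tags = tag_path.read_text(encoding='utf-8').strip() if tag_path.exists() else ''
    print(image_path.name, '->', tags)
```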
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, solo, white_gloves, simple_background, blush, pleated_skirt, short_sleeves, white_background, looking_at_viewer, closed_mouth, white_shirt, full_body, white_skirt, black_thighhighs, brooch, thigh_strap |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_gloves | simple_background | blush | pleated_skirt | short_sleeves | white_background | looking_at_viewer | closed_mouth | white_shirt | full_body | white_skirt | black_thighhighs | brooch | thigh_strap |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------------|:--------|:----------------|:----------------|:-------------------|:--------------------|:---------------|:--------------|:------------|:--------------|:-------------------|:---------|:--------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/bulldog_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T06:38:39+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T06:42:06+00:00 |
a0d5cb8401b1224e95d7f0762981b12b0768527b |
# Dataset Card for Evaluation run of ewqr2130/TinyLamma-SFT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/TinyLamma-SFT](https://huggingface.co/ewqr2130/TinyLamma-SFT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT",
"harness_winogrande_5",
	split="latest")
```
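The aggregated `results` configuration and the timestamped splits mentioned above can be loaded the same way; a short sketch, assuming the config and split names follow the pattern used in this card's metadata:
```python
from datasets import load_dataset
# aggregated metrics for the run (config "results", most recent snapshot)
results = load_dataset("open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT",
	"results",
	split="latest")
# the same task details pinned to the exact run timestamp
winogrande = load_dataset("open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT",
	"harness_winogrande_5",
	split="2024_01_14T06_47_16.082235")
```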
## Latest results
These are the [latest results from run 2024-01-14T06:47:16.082235](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT/blob/main/results_2024-01-14T06-47-16.082235.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2490861506190792,
"acc_stderr": 0.030409991529850307,
"acc_norm": 0.25019487136318186,
"acc_norm_stderr": 0.031150216526222626,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807763,
"mc2": 0.372025520096151,
"mc2_stderr": 0.013802667425788874
},
"harness|arc:challenge|25": {
"acc": 0.3174061433447099,
"acc_stderr": 0.01360223908803817,
"acc_norm": 0.3438566552901024,
"acc_norm_stderr": 0.013880644570156213
},
"harness|hellaswag|10": {
"acc": 0.4475204142601075,
"acc_stderr": 0.004962220512548357,
"acc_norm": 0.5914160525791675,
"acc_norm_stderr": 0.004905674408614011
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.03406542058502653,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.03406542058502653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.15789473684210525,
"acc_stderr": 0.029674167520101474,
"acc_norm": 0.15789473684210525,
"acc_norm_stderr": 0.029674167520101474
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566019,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566019
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788147,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788147
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617748,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617748
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2297872340425532,
"acc_stderr": 0.027501752944412424,
"acc_norm": 0.2297872340425532,
"acc_norm_stderr": 0.027501752944412424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022057,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022057
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708624,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708624
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.037649508797906066,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.037649508797906066
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462826,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.028748983689941054,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.028748983689941054
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.02075242372212801,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.02075242372212801
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868963,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868963
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804724,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804724
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21284403669724772,
"acc_stderr": 0.01754937638931369,
"acc_norm": 0.21284403669724772,
"acc_norm_stderr": 0.01754937638931369
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536023,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536023
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693254,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693254
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094634,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094634
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.1262135922330097,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.1262135922330097,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749475,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749475
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2720306513409962,
"acc_stderr": 0.015913367447500517,
"acc_norm": 0.2720306513409962,
"acc_norm_stderr": 0.015913367447500517
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.022797110278071134,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.022797110278071134
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.01446589382985993,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.01446589382985993
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3022508038585209,
"acc_stderr": 0.026082700695399672,
"acc_norm": 0.3022508038585209,
"acc_norm_stderr": 0.026082700695399672
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432414,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432414
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.024231013370541107,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.024231013370541107
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.017883188134667195,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.017883188134667195
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.022401787435256396,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.022401787435256396
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553026,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553026
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807763,
"mc2": 0.372025520096151,
"mc2_stderr": 0.013802667425788874
},
"harness|winogrande|5": {
"acc": 0.5864246250986582,
"acc_stderr": 0.013840971763195303
},
"harness|gsm8k|5": {
"acc": 0.016679302501895376,
"acc_stderr": 0.0035275958887224265
}
}
```
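The same numbers can also be pulled straight from the results file linked above without going through `datasets`. A minimal sketch (the file name comes from the link above; the `get("results", ...)` fallback is an assumption about the file's internal layout):
```python
import json
from huggingface_hub import hf_hub_download
# download the raw results JSON for this run
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT",
    repo_type="dataset",
    filename="results_2024-01-14T06-47-16.082235.json",
)
with open(results_path, encoding="utf-8") as f:
    payload = json.load(f)
# per-task metrics may sit under a top-level "results" key or at the root (assumption)
metrics = payload.get("results", payload)
print(metrics["all"])
```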
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT | [
"region:us"
] | 2024-01-14T06:49:04+00:00 | {"pretty_name": "Evaluation run of ewqr2130/TinyLamma-SFT", "dataset_summary": "Dataset automatically created during the evaluation run of model [ewqr2130/TinyLamma-SFT](https://huggingface.co/ewqr2130/TinyLamma-SFT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T06:47:16.082235](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT/blob/main/results_2024-01-14T06-47-16.082235.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2490861506190792,\n \"acc_stderr\": 0.030409991529850307,\n \"acc_norm\": 0.25019487136318186,\n \"acc_norm_stderr\": 0.031150216526222626,\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807763,\n \"mc2\": 0.372025520096151,\n \"mc2_stderr\": 0.013802667425788874\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3174061433447099,\n \"acc_stderr\": 0.01360223908803817,\n \"acc_norm\": 0.3438566552901024,\n \"acc_norm_stderr\": 0.013880644570156213\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4475204142601075,\n \"acc_stderr\": 0.004962220512548357,\n \"acc_norm\": 0.5914160525791675,\n \"acc_norm_stderr\": 0.004905674408614011\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.1925925925925926,\n \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.15789473684210525,\n \"acc_stderr\": 0.029674167520101474,\n \"acc_norm\": 0.15789473684210525,\n \"acc_norm_stderr\": 0.029674167520101474\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.03716177437566019,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 0.03716177437566019\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n 
\"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n \"acc_stderr\": 0.030299574664788147,\n \"acc_norm\": 0.19653179190751446,\n \"acc_norm_stderr\": 0.030299574664788147\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617748,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617748\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2297872340425532,\n \"acc_stderr\": 0.027501752944412424,\n \"acc_norm\": 0.2297872340425532,\n \"acc_norm_stderr\": 0.027501752944412424\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022057,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022057\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708624,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708624\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n \"acc_stderr\": 0.037649508797906066,\n \"acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.037649508797906066\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n \"acc_stderr\": 0.024892469172462826,\n \"acc_norm\": 0.25806451612903225,\n \"acc_norm_stderr\": 0.024892469172462826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.028748983689941054,\n \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.028748983689941054\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.03051611137147601,\n \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.03051611137147601\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.02075242372212801,\n \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.02075242372212801\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868963,\n \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868963\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804724,\n \"acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804724\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21284403669724772,\n \"acc_stderr\": 0.01754937638931369,\n \"acc_norm\": 0.21284403669724772,\n \"acc_norm_stderr\": 0.01754937638931369\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536023,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536023\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693254,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693254\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2869198312236287,\n \"acc_stderr\": 0.02944377302259469,\n \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.02944377302259469\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.34080717488789236,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.04236511258094634,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.04236511258094634\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1262135922330097,\n \"acc_stderr\": 0.03288180278808628,\n \"acc_norm\": 0.1262135922330097,\n \"acc_norm_stderr\": 0.03288180278808628\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n \"acc_stderr\": 0.028911208802749475,\n \"acc_norm\": 0.26495726495726496,\n \"acc_norm_stderr\": 0.028911208802749475\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2720306513409962,\n \"acc_stderr\": 
0.015913367447500517,\n \"acc_norm\": 0.2720306513409962,\n \"acc_norm_stderr\": 0.015913367447500517\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071134,\n \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071134\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.01446589382985993,\n \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.01446589382985993\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3022508038585209,\n \"acc_stderr\": 0.026082700695399672,\n \"acc_norm\": 0.3022508038585209,\n \"acc_norm_stderr\": 0.026082700695399672\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432414,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432414\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541107,\n \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541107\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.26633986928104575,\n \"acc_stderr\": 0.017883188134667195,\n \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.017883188134667195\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.24545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.14285714285714285,\n \"acc_stderr\": 0.022401787435256396,\n \"acc_norm\": 0.14285714285714285,\n \"acc_norm_stderr\": 0.022401787435256396\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553026,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553026\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807763,\n \"mc2\": 0.372025520096151,\n \"mc2_stderr\": 0.013802667425788874\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5864246250986582,\n \"acc_stderr\": 0.013840971763195303\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.016679302501895376,\n \"acc_stderr\": 0.0035275958887224265\n 
}\n}\n```", "repo_url": "https://huggingface.co/ewqr2130/TinyLamma-SFT", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-47-16.082235.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-47-16.082235.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-47-16.082235.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-47-16.082235.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|winogrande|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T06_47_16.082235", "path": ["results_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T06-47-16.082235.parquet"]}]}]} | 2024-01-14T06:49:24+00:00 |
f11af3a2f535da364d3fa988ca136b89aa859cfa |
# Dataset Card for Evaluation run of Suprit/Zhongjing-LLaMA-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Suprit/Zhongjing-LLaMA-base](https://huggingface.co/Suprit/Zhongjing-LLaMA-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base",
"harness_winogrande_5",
	split="latest")
```
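The same pattern works for any of the configurations listed in this card's metadata. A minimal sketch (the config names and the timestamped split name below are taken from this card's configuration list; the "results" configuration is assumed to follow the same split naming):
```python
from datasets import load_dataset

# Aggregated metrics for the run (one row per evaluation), latest version.
results = load_dataset(
    "open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base",
    "results",
    split="latest",
)

# Per-sample details for a single task, pinned to a specific timestamped run.
arc_details = load_dataset(
    "open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base",
    "harness_arc_challenge_25",
    split="2024_01_14T06_48_13.310278",
)
```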
## Latest results
These are the [latest results from run 2024-01-14T06:48:13.310278](https://huggingface.co/datasets/open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base/blob/main/results_2024-01-14T06-48-13.310278.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48560692489095947,
"acc_stderr": 0.03450713063212824,
"acc_norm": 0.48879741973292123,
"acc_norm_stderr": 0.03524925803152966,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.4888307270560722,
"mc2_stderr": 0.015123753734506709
},
"harness|arc:challenge|25": {
"acc": 0.5196245733788396,
"acc_stderr": 0.0146001320759471,
"acc_norm": 0.5511945392491467,
"acc_norm_stderr": 0.014534599585097664
},
"harness|hellaswag|10": {
"acc": 0.6026687910774746,
"acc_stderr": 0.004883455188908963,
"acc_norm": 0.7971519617606054,
"acc_norm_stderr": 0.004012984497778308
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.040403110624904356,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.040403110624904356
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.037940126746970275,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.037940126746970275
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101737,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101737
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.0437588849272706,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.0437588849272706
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5451612903225806,
"acc_stderr": 0.028327743091561074,
"acc_norm": 0.5451612903225806,
"acc_norm_stderr": 0.028327743091561074
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.03464881675016339,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.03464881675016339
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.033088185944157494,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.033088185944157494
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4564102564102564,
"acc_stderr": 0.025254485424799595,
"acc_norm": 0.4564102564102564,
"acc_norm_stderr": 0.025254485424799595
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871923,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871923
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.48739495798319327,
"acc_stderr": 0.032468167657521745,
"acc_norm": 0.48739495798319327,
"acc_norm_stderr": 0.032468167657521745
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6403669724770642,
"acc_stderr": 0.020575234660123776,
"acc_norm": 0.6403669724770642,
"acc_norm_stderr": 0.020575234660123776
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536027,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536027
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03471157907953427,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03471157907953427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.03038193194999041,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.03038193194999041
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4618834080717489,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.4618834080717489,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.04507732278775087,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.04507732278775087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.03895632464138938,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.03895632464138938
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.047504583990416946,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.047504583990416946
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7478632478632479,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.7478632478632479,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6845466155810983,
"acc_stderr": 0.016617501738763397,
"acc_norm": 0.6845466155810983,
"acc_norm_stderr": 0.016617501738763397
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260664,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260664
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.028580341065138296,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.028580341065138296
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5466237942122186,
"acc_stderr": 0.028274359854894248,
"acc_norm": 0.5466237942122186,
"acc_norm_stderr": 0.028274359854894248
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5771604938271605,
"acc_stderr": 0.02748747298087159,
"acc_norm": 0.5771604938271605,
"acc_norm_stderr": 0.02748747298087159
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.028538650028878645,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.028538650028878645
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.012150699768228553,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.012150699768228553
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.020203517280261443,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.020203517280261443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03333333333333334,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03333333333333334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.4888307270560722,
"mc2_stderr": 0.015123753734506709
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.012198489100259785
},
"harness|gsm8k|5": {
"acc": 0.2608036391205459,
"acc_stderr": 0.012094252417332734
}
}
```
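If you prefer to work with the raw JSON file linked above rather than the parquet configurations, here is a minimal sketch (assuming `huggingface_hub` is installed; the filename is the one from the link above, and the per-task dictionary is assumed to be either the top level of the file or nested under a "results" key):
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base",
    filename="results_2024-01-14T06-48-13.310278.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# The block printed above may be the whole file or live under a "results" key.
per_task = data.get("results", data)

# Example: average accuracy over the MMLU (hendrycksTest) subtasks.
mmlu_accs = [v["acc"] for k, v in per_task.items() if k.startswith("harness|hendrycksTest")]
print(f"MMLU average acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```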
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base | [
"region:us"
] | 2024-01-14T06:50:02+00:00 | {"pretty_name": "Evaluation run of Suprit/Zhongjing-LLaMA-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [Suprit/Zhongjing-LLaMA-base](https://huggingface.co/Suprit/Zhongjing-LLaMA-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T06:48:13.310278](https://huggingface.co/datasets/open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base/blob/main/results_2024-01-14T06-48-13.310278.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48560692489095947,\n \"acc_stderr\": 0.03450713063212824,\n \"acc_norm\": 0.48879741973292123,\n \"acc_norm_stderr\": 0.03524925803152966,\n \"mc1\": 0.34761321909424725,\n \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.4888307270560722,\n \"mc2_stderr\": 0.015123753734506709\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5196245733788396,\n \"acc_stderr\": 0.0146001320759471,\n \"acc_norm\": 0.5511945392491467,\n \"acc_norm_stderr\": 0.014534599585097664\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6026687910774746,\n \"acc_stderr\": 0.004883455188908963,\n \"acc_norm\": 0.7971519617606054,\n \"acc_norm_stderr\": 0.004012984497778308\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.040403110624904356,\n \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.040403110624904356\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 
0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n \"acc_stderr\": 0.037940126746970275,\n \"acc_norm\": 0.4508670520231214,\n \"acc_norm_stderr\": 0.037940126746970275\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101737,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101737\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5451612903225806,\n \"acc_stderr\": 0.028327743091561074,\n \"acc_norm\": 0.5451612903225806,\n \"acc_norm_stderr\": 0.028327743091561074\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6161616161616161,\n \"acc_stderr\": 0.03464881675016339,\n \"acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.03464881675016339\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.033088185944157494,\n \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.033088185944157494\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.4564102564102564,\n \"acc_stderr\": 0.025254485424799595,\n \"acc_norm\": 0.4564102564102564,\n \"acc_norm_stderr\": 0.025254485424799595\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.48739495798319327,\n \"acc_stderr\": 0.032468167657521745,\n \"acc_norm\": 0.48739495798319327,\n \"acc_norm_stderr\": 0.032468167657521745\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6403669724770642,\n \"acc_stderr\": 0.020575234660123776,\n \"acc_norm\": 0.6403669724770642,\n \"acc_norm_stderr\": 0.020575234660123776\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536027,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536027\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03471157907953427,\n \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03471157907953427\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.679324894514768,\n \"acc_stderr\": 0.03038193194999041,\n \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.03038193194999041\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4618834080717489,\n \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.4618834080717489,\n \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775087,\n \"acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775087\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138938,\n \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138938\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.047504583990416946,\n \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.047504583990416946\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7478632478632479,\n \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.7478632478632479,\n \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6845466155810983,\n \"acc_stderr\": 0.016617501738763397,\n 
\"acc_norm\": 0.6845466155810983,\n \"acc_norm_stderr\": 0.016617501738763397\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.026902900458666647,\n \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.026902900458666647\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n \"acc_stderr\": 0.014756906483260664,\n \"acc_norm\": 0.264804469273743,\n \"acc_norm_stderr\": 0.014756906483260664\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.028580341065138296,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.028580341065138296\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n \"acc_stderr\": 0.028274359854894248,\n \"acc_norm\": 0.5466237942122186,\n \"acc_norm_stderr\": 0.028274359854894248\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5771604938271605,\n \"acc_stderr\": 0.02748747298087159,\n \"acc_norm\": 0.5771604938271605,\n \"acc_norm_stderr\": 0.02748747298087159\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3546099290780142,\n \"acc_stderr\": 0.028538650028878645,\n \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.028538650028878645\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34615384615384615,\n \"acc_stderr\": 0.012150699768228553,\n \"acc_norm\": 0.34615384615384615,\n \"acc_norm_stderr\": 0.012150699768228553\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.47549019607843135,\n \"acc_stderr\": 0.020203517280261443,\n \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.020203517280261443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.031987615467631264,\n \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.031987615467631264\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03333333333333334,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03333333333333334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.4888307270560722,\n \"mc2_stderr\": 0.015123753734506709\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.012198489100259785\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2608036391205459,\n \"acc_stderr\": 0.012094252417332734\n }\n}\n```", "repo_url": 
"https://huggingface.co/Suprit/Zhongjing-LLaMA-base", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-48-13.310278.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-48-13.310278.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-48-13.310278.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-48-13.310278.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-48-13.310278.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|winogrande|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["results_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T06-48-13.310278.parquet"]}]}]} | 2024-01-14T06:50:23+00:00 |
699e571c33c977be08b9ac4c584bd9600c595b7b |
# Dataset of craven/クレイヴン/克雷文 (Azur Lane)
This is the dataset of craven/クレイヴン/克雷文 (Azur Lane), containing 17 images and their tags.
The core tags of this character are `long_hair, purple_hair, drill_hair, yellow_eyes, bangs, breasts, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 14.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 10.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 36 | 20.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 13.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 36 | 25.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
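The `IMG+TXT` packages above can also be used directly, without waifuc. Below is a minimal sketch for the 800px package; it assumes the zip contains image files with same-stem `.txt` caption files next to them (this layout is an assumption about the packaging, not something the table guarantees):
```python
import os
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download
from PIL import Image

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/craven_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it to a local directory
extract_dir = 'craven_800'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)

# pair every caption file with its image (assumed same-stem layout)
for txt_path in Path(extract_dir).rglob('*.txt'):
    caption = txt_path.read_text(encoding='utf-8').strip()
    for ext in ('.png', '.jpg', '.jpeg', '.webp'):
        img_path = txt_path.with_suffix(ext)
        if img_path.exists():
            image = Image.open(img_path)
            print(img_path.name, image.size, caption[:60])
            break
```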
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/craven_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, blush, smile, looking_at_viewer, solo, open_mouth, white_thighhighs, navel, pleated_skirt, full_body, sailor_collar, school_uniform, shirt, shoes, standing, cheerleader, collarbone, long_sleeves, midriff, one_eye_closed, pom_pom_(cheerleading), white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | smile | looking_at_viewer | solo | open_mouth | white_thighhighs | navel | pleated_skirt | full_body | sailor_collar | school_uniform | shirt | shoes | standing | cheerleader | collarbone | long_sleeves | midriff | one_eye_closed | pom_pom_(cheerleading) | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:--------------------|:-------|:-------------|:-------------------|:--------|:----------------|:------------|:----------------|:-----------------|:--------|:--------|:-----------|:--------------|:-------------|:---------------|:----------|:-----------------|:-------------------------|:-------------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/craven_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T06:54:58+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T06:58:59+00:00 |
170a5823e7cb0f7da2c7d9afe9e703b58f6c009d |
# Dataset of stephen_potter/ステフェン・ポッター/史蒂芬·波特 (Azur Lane)
This is the dataset of stephen_potter/ステフェン・ポッター/史蒂芬·波特 (Azur Lane), containing 21 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, hat, braid, hair_ornament, breasts, beret`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 21 | 32.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stephen_potter_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 21 | 16.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stephen_potter_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 54 | 36.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stephen_potter_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 21 | 28.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stephen_potter_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 54 | 55.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stephen_potter_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/stephen_potter_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, navel, pantyhose, sailor_collar, shorts, simple_background, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | blush | navel | pantyhose | sailor_collar | shorts | simple_background | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:--------|:------------|:----------------|:---------|:--------------------|:-------------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/stephen_potter_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T06:55:00+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T07:02:06+00:00 |
2b3efafef5e1dd19be0777c370ee23b14e1421a6 |
# Dataset of nowaki/野分/野分 (Azur Lane)
This is the dataset of nowaki/野分/野分 (Azur Lane), containing 12 images and their tags.
The core tags of this character are `ahoge, long_hair, black_hair, brown_eyes, headgear, yellow_eyes, very_long_hair, bangs, breasts, blue_ribbon, small_breasts, bow, brown_hair, hair_between_eyes, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 10.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nowaki_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nowaki_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 29 | 15.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nowaki_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 10.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nowaki_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 29 | 18.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nowaki_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/nowaki_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | blush, 1girl, looking_at_viewer, solo, black_skirt, midriff, navel, pleated_skirt, collarbone, detached_sleeves, parted_lips, sailor_collar, white_shirt, crop_top, simple_background, single_thighhigh, sleeveless, white_background, bare_shoulders, wide_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | 1girl | looking_at_viewer | solo | black_skirt | midriff | navel | pleated_skirt | collarbone | detached_sleeves | parted_lips | sailor_collar | white_shirt | crop_top | simple_background | single_thighhigh | sleeveless | white_background | bare_shoulders | wide_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:--------------|:----------|:--------|:----------------|:-------------|:-------------------|:--------------|:----------------|:--------------|:-----------|:--------------------|:-------------------|:-------------|:-------------------|:-----------------|:---------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/nowaki_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T06:55:24+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T06:58:21+00:00 |
119cf708cf90865b277aa699c8ae65d0ca2daded |
# Dataset Card for Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2",
"harness_winogrande_5",
split="train")
```
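The aggregated numbers mentioned above live in the "results" configuration; here is a small sketch for pulling the most recent aggregate row (assuming, as with the per-task configurations, that a "latest" split tracks the newest run):
```python
from datasets import load_dataset

# load the aggregated "results" configuration; the "latest" split is assumed
# to point at the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2",
    "results",
    split="latest",
)
print(results[0])
```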
## Latest results
These are the [latest results from run 2024-01-14T06:58:33.296331](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2/blob/main/results_2024-01-14T06-58-33.296331.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6259237317793794,
"acc_stderr": 0.032605779344111435,
"acc_norm": 0.630765473051009,
"acc_norm_stderr": 0.0332633311611487,
"mc1": 0.4467564259485924,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.6239282651782372,
"mc2_stderr": 0.015497224514490227
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.014104578366491894,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.6633140808603863,
"acc_stderr": 0.004716106475905091,
"acc_norm": 0.8490340569607648,
"acc_norm_stderr": 0.003572839969521999
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520203,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520203
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.02489246917246283,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.02489246917246283
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514566,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514566
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612893,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069436,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516304,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516304
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066293,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066293
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.02468531686725781,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.02468531686725781
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.01617569201338196,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.01617569201338196
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.012685906538206242,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.012685906538206242
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.02916312857067073,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.02916312857067073
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4467564259485924,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.6239282651782372,
"mc2_stderr": 0.015497224514490227
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090252
},
"harness|gsm8k|5": {
"acc": 0.39651250947687644,
"acc_stderr": 0.013474258584033352
}
}
```
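If you prefer working with the raw JSON file linked above rather than the `datasets` API, a minimal sketch for fetching it directly is shown below (the filename is taken from the "Latest results" link; the exact layout of the stored file is an assumption, so the code handles both a flat and a nested "results" structure):
```python
import json

from huggingface_hub import hf_hub_download

# fetch the aggregated results file referenced in the "Latest results" link above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2",
    repo_type="dataset",
    filename="results_2024-01-14T06-58-33.296331.json",
)

with open(path, "r", encoding="utf-8") as f:
    data = json.load(f)

# the snippet above shows the metrics block; in the stored file it may be
# nested under a "results" key, so handle both layouts
metrics = data.get("results", data)
print(json.dumps(metrics.get("all", {}), indent=2))
```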
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2 | [
"region:us"
] | 2024-01-14T07:00:54+00:00 | {"pretty_name": "Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T06:58:33.296331](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2/blob/main/results_2024-01-14T06-58-33.296331.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6259237317793794,\n \"acc_stderr\": 0.032605779344111435,\n \"acc_norm\": 0.630765473051009,\n \"acc_norm_stderr\": 0.0332633311611487,\n \"mc1\": 0.4467564259485924,\n \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.6239282651782372,\n \"mc2_stderr\": 0.015497224514490227\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.014104578366491894,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6633140808603863,\n \"acc_stderr\": 0.004716106475905091,\n \"acc_norm\": 0.8490340569607648,\n \"acc_norm_stderr\": 0.003572839969521999\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520203,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520203\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.02489246917246283,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.02489246917246283\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139404,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139404\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097417,\n \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097417\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514566,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514566\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612893,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612893\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069436,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516304,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516304\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n 
\"acc_stderr\": 0.013468201614066293,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066293\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.02468531686725781,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.02468531686725781\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n \"acc_stderr\": 0.01617569201338196,\n \"acc_norm\": 0.37318435754189944,\n \"acc_norm_stderr\": 0.01617569201338196\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n \"acc_stderr\": 0.012685906538206242,\n \"acc_norm\": 0.4426336375488918,\n \"acc_norm_stderr\": 0.012685906538206242\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4467564259485924,\n \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.6239282651782372,\n \"mc2_stderr\": 0.015497224514490227\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090252\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39651250947687644,\n \"acc_stderr\": 0.013474258584033352\n }\n}\n```", "repo_url": 
"https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-58-33.296331.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-58-33.296331.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-58-33.296331.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-58-33.296331.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|winogrande|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T06_58_33.296331", "path": ["results_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T06-58-33.296331.parquet"]}]}]} | 2024-01-14T07:01:14+00:00 |
58ad295e5d95c9df14c65f7b78747f46444b8cf5 |
# Dataset of halsey_powell/ハルゼー・パウエル/哈尔西·鲍威尔 (Azur Lane)
This is the dataset of halsey_powell/ハルゼー・パウエル/哈尔西·鲍威尔 (Azur Lane), containing 20 images and their tags.
The core tags of this character are `blue_eyes, long_hair, breasts, ahoge, grey_hair, hair_ornament, twintails, hairclip, bangs, small_breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 22.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/halsey_powell_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 14.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/halsey_powell_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 45 | 27.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/halsey_powell_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 21.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/halsey_powell_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 45 | 37.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/halsey_powell_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/halsey_powell_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
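The pre-processed IMG+TXT packages listed above (for example `dataset-800.zip`) can also be used without waifuc. The sketch below is only illustrative: it assumes each image inside the archive is paired with a same-named `.txt` file holding its tags, which this card does not explicitly guarantee.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
# download one of the pre-processed packages (shorter side not exceeding 800 pixels)
zip_file = hf_hub_download(
    repo_id='CyberHarem/halsey_powell_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract files to a separate directory
img_txt_dir = 'dataset_800_dir'
os.makedirs(img_txt_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(img_txt_dir)
# pair every image with its (assumed) same-named .txt tag file
for root, _, files in os.walk(img_txt_dir):
    for name in files:
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
            continue
        tag_file = os.path.join(root, stem + '.txt')
        tags = ''
        if os.path.exists(tag_file):
            with open(tag_file, 'r', encoding='utf-8') as f:
                tags = f.read().strip()
        print(os.path.join(root, name), tags)
```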
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, looking_at_viewer, solo, detached_sleeves, necktie, blush, dress, white_thighhighs, simple_background, white_background, sailor_collar, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | detached_sleeves | necktie | blush | dress | white_thighhighs | simple_background | white_background | sailor_collar | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------------|:----------|:--------|:--------|:-------------------|:--------------------|:-------------------|:----------------|:--------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/halsey_powell_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T07:10:39+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T07:14:59+00:00 |
097e130d0e6c075dab7759116a35bf8e66483a56 |
# Dataset of chkalov/チカロフ/契卡洛夫 (Azur Lane)
This is the dataset of chkalov/チカロフ/契卡洛夫 (Azur Lane), containing 20 images and their tags.
The core tags of this character are `breasts, long_hair, large_breasts, yellow_eyes, bangs, grey_hair, hair_between_eyes, mole, mole_under_eye, parted_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 30.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chkalov_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 14.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chkalov_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 52 | 34.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chkalov_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 25.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chkalov_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 52 | 52.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chkalov_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chkalov_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
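The cluster tables below are mined from the per-image tags. If you want to compute your own tag frequencies from the extracted raw dataset, here is a small sketch; it assumes `item.meta['tags']` is either a tag-to-score mapping or a plain list of tag names, so it handles both cases defensively.
```python
from collections import Counter
from waifuc.source import LocalSource
# count how often each tag appears across the extracted raw dataset
counter = Counter()
for item in LocalSource('dataset_dir'):
    tags = item.meta.get('tags', {})
    # tags may be a mapping (tag -> score) or a simple list of tag names
    counter.update(tags.keys() if isinstance(tags, dict) else tags)
# print the most common tags, similar to the cluster tables below
for tag, count in counter.most_common(15):
    print(f'{tag}: {count}')
```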
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, looking_at_viewer, cleavage, solo, blush, smile, black_choker, black_gloves, black_shirt, open_clothes, white_coat, collarbone, jewelry, parted_lips, skirt, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | cleavage | solo | blush | smile | black_choker | black_gloves | black_shirt | open_clothes | white_coat | collarbone | jewelry | parted_lips | skirt | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-----------|:-------|:--------|:--------|:---------------|:---------------|:--------------|:---------------|:-------------|:-------------|:----------|:--------------|:--------|:-------------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/chkalov_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T07:10:51+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T07:16:19+00:00 |
a4068bfca8c1e777595ce1b4f4b3d5460cef2130 |
# Dataset Card for Evaluation run of FelixChao/NinjaDolphin-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/NinjaDolphin-7B](https://huggingface.co/FelixChao/NinjaDolphin-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B",
"harness_winogrande_5",
split="train")
```
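The aggregated "results" configuration mentioned above can be loaded the same way. This is a sketch only; it assumes the "results" configuration uses the same split naming as the per-task configurations (a timestamped split plus "latest").
```python
from datasets import load_dataset
# aggregated metrics of the run (one row per evaluation snapshot)
results = load_dataset("open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B",
	"results",
	split="latest")
print(results[0])
```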
## Latest results
These are the [latest results from run 2024-01-14T07:09:51.567777](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B/blob/main/results_2024-01-14T07-09-51.567777.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6473574572689726,
"acc_stderr": 0.03204891578067438,
"acc_norm": 0.6480292804481895,
"acc_norm_stderr": 0.03270235131918203,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5494303013121583,
"mc2_stderr": 0.015522294140989212
},
"harness|arc:challenge|25": {
"acc": 0.6186006825938567,
"acc_stderr": 0.014194389086685247,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156218
},
"harness|hellaswag|10": {
"acc": 0.6649073889663414,
"acc_stderr": 0.0047105814966393374,
"acc_norm": 0.8535152360087632,
"acc_norm_stderr": 0.0035286889976580537
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501562,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02675640153807897,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02675640153807897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389104,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389104
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608313,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608313
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.01611523550486547,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.01611523550486547
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959614,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959614
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897227,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897227
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399683,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399683
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5494303013121583,
"mc2_stderr": 0.015522294140989212
},
"harness|winogrande|5": {
"acc": 0.8026835043409629,
"acc_stderr": 0.011185026389050374
},
"harness|gsm8k|5": {
"acc": 0.6785443517816527,
"acc_stderr": 0.012864471384836703
}
}
```
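The same figures can also be read straight from the results file linked above, without going through `datasets`. The sketch below uses the filename from that link; the `.get("results", ...)` fallback is only a defensive assumption about the file layout.
```python
import json
from huggingface_hub import hf_hub_download
# fetch the raw results file referenced in the "Latest results" link above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B",
    repo_type="dataset",
    filename="results_2024-01-14T07-09-51.567777.json",
)
with open(path, "r", encoding="utf-8") as f:
    data = json.load(f)
# the snippet shown above corresponds to the "results" section of the file
metrics = data.get("results", data)
print(metrics["all"])
```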
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B | [
"region:us"
] | 2024-01-14T07:12:07+00:00 | {"pretty_name": "Evaluation run of FelixChao/NinjaDolphin-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/NinjaDolphin-7B](https://huggingface.co/FelixChao/NinjaDolphin-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T07:09:51.567777](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B/blob/main/results_2024-01-14T07-09-51.567777.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6473574572689726,\n \"acc_stderr\": 0.03204891578067438,\n \"acc_norm\": 0.6480292804481895,\n \"acc_norm_stderr\": 0.03270235131918203,\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5494303013121583,\n \"mc2_stderr\": 0.015522294140989212\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6186006825938567,\n \"acc_stderr\": 0.014194389086685247,\n \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156218\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6649073889663414,\n \"acc_stderr\": 0.0047105814966393374,\n \"acc_norm\": 0.8535152360087632,\n \"acc_norm_stderr\": 0.0035286889976580537\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501562,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501562\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807897,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807897\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389104,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389104\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608313,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608313\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n \"acc_stderr\": 0.01611523550486547,\n \"acc_norm\": 0.3664804469273743,\n \"acc_norm_stderr\": 0.01611523550486547\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959614,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959614\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897227,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897227\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399683,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399683\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5494303013121583,\n \"mc2_stderr\": 0.015522294140989212\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8026835043409629,\n \"acc_stderr\": 0.011185026389050374\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6785443517816527,\n \"acc_stderr\": 0.012864471384836703\n 
}\n}\n```", "repo_url": "https://huggingface.co/FelixChao/NinjaDolphin-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-09-51.567777.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-09-51.567777.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-09-51.567777.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-09-51.567777.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|winogrande|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T07_09_51.567777", "path": ["results_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T07-09-51.567777.parquet"]}]}]} | 2024-01-14T07:12:28+00:00 |
688a17d46e8e33c0cf369c043f0552a0077ec5d4 | SumitMdhr/ASR | [
"region:us"
] | 2024-01-14T07:12:43+00:00 | {} | 2024-02-04T02:41:57+00:00 |
|
d2e7f5c27b7d5058a0e4dc638e2a3b68adadd10c |
# Dataset Card for Evaluation run of jan-hq/stealth-v1.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jan-hq/stealth-v1.2](https://huggingface.co/jan-hq/stealth-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jan-hq__stealth-v1.2",
"harness_winogrande_5",
split="train")
```
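The aggregated scores can be loaded the same way. Below is a minimal sketch that reads the "results" configuration mentioned above via its "latest" split; the exact column layout of that table is not shown in this card, so the snippet only prints the first row for inspection.

```python
from datasets import load_dataset

# Aggregated metrics for this model live in the "results" configuration
# described above; the "latest" split always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_jan-hq__stealth-v1.2",
    "results",
    split="latest",
)

# One row per run; print the aggregated metrics of the first (and only) run.
print(results[0])
```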
## Latest results
These are the [latest results from run 2024-01-14T07:26:57.769050](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-v1.2/blob/main/results_2024-01-14T07-26-57.769050.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.647630716246248,
"acc_stderr": 0.032067729447727726,
"acc_norm": 0.64733509976457,
"acc_norm_stderr": 0.032732372289814314,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5422529201207911,
"mc2_stderr": 0.01526615960034381
},
"harness|arc:challenge|25": {
"acc": 0.6331058020477816,
"acc_stderr": 0.014084133118104296,
"acc_norm": 0.6638225255972696,
"acc_norm_stderr": 0.013804855026205761
},
"harness|hellaswag|10": {
"acc": 0.6748655646285601,
"acc_stderr": 0.004674677287148613,
"acc_norm": 0.8613821947819159,
"acc_norm_stderr": 0.003448410595239921
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335075,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335075
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083018,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507337,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507337
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741622,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741622
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40670391061452515,
"acc_stderr": 0.01642881191589886,
"acc_norm": 0.40670391061452515,
"acc_norm_stderr": 0.01642881191589886
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042114,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653354,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653354
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5422529201207911,
"mc2_stderr": 0.01526615960034381
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491904
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.012333447581047525
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jan-hq__stealth-v1.2 | [
"region:us"
] | 2024-01-14T07:29:20+00:00 | {"pretty_name": "Evaluation run of jan-hq/stealth-v1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [jan-hq/stealth-v1.2](https://huggingface.co/jan-hq/stealth-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jan-hq__stealth-v1.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T07:26:57.769050](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-v1.2/blob/main/results_2024-01-14T07-26-57.769050.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.647630716246248,\n \"acc_stderr\": 0.032067729447727726,\n \"acc_norm\": 0.64733509976457,\n \"acc_norm_stderr\": 0.032732372289814314,\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5422529201207911,\n \"mc2_stderr\": 0.01526615960034381\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6331058020477816,\n \"acc_stderr\": 0.014084133118104296,\n \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205761\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6748655646285601,\n \"acc_stderr\": 0.004674677287148613,\n \"acc_norm\": 0.8613821947819159,\n \"acc_norm_stderr\": 0.003448410595239921\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n 
\"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n 
\"acc_stderr\": 0.024035489676335075,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335075\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083018,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083018\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507337,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507337\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n \"acc_stderr\": 0.013223928616741622,\n \"acc_norm\": 
0.8365261813537676,\n \"acc_norm_stderr\": 0.013223928616741622\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.02317629820399201,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.02317629820399201\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40670391061452515,\n \"acc_stderr\": 0.01642881191589886,\n \"acc_norm\": 0.40670391061452515,\n \"acc_norm_stderr\": 0.01642881191589886\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042114,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653354,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653354\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5422529201207911,\n \"mc2_stderr\": 0.01526615960034381\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491904\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \"acc_stderr\": 0.012333447581047525\n }\n}\n```", "repo_url": "https://huggingface.co/jan-hq/stealth-v1.2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-26-57.769050.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-26-57.769050.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-26-57.769050.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-26-57.769050.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-26-57.769050.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|winogrande|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["results_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T07-26-57.769050.parquet"]}]}]} | 2024-01-14T07:29:39+00:00 |
0398aa0d6dad0e29d4e2047005b11a58d54c7f0d |
# Dataset of cavalla/カヴァラ/棘鳍 (Azur Lane)
This is the dataset of cavalla/カヴァラ/棘鳍 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `long_hair, blonde_hair, ponytail, breasts, small_breasts, bangs, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 22.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cavalla_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 11.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cavalla_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 32 | 23.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cavalla_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 18.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cavalla_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 32 | 34.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cavalla_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cavalla_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
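Once loaded, the per-image tags can be summarized quickly. The sketch below is a minimal example built only on the iteration pattern shown above; it assumes `item.meta['tags']` is either a tag-to-score mapping or a plain list of tag names, which is not specified here.
```python
from collections import Counter

from waifuc.source import LocalSource

# tally tag frequencies over the extracted dataset
# (item.meta['tags'] is assumed to be a tag -> score mapping or a list of tag names)
counter = Counter()
for item in LocalSource('dataset_dir'):
    tags = item.meta.get('tags', {})
    counter.update(tags.keys() if isinstance(tags, dict) else tags)

# print the 20 most common tags
for tag, count in counter.most_common(20):
    print(f'{tag}: {count}')
```
The most frequent tags found this way should roughly match the clusters listed in the next section.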
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, smile, open_mouth, blush, solo, bare_shoulders, holding |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | open_mouth | blush | solo | bare_shoulders | holding |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------------|:--------|:-------|:-----------------|:----------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X |
| CyberHarem/cavalla_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T07:31:04+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T07:37:20+00:00 |
8988a7f752da125d9409d5d4f36ad8af605b597d |
# Dataset Card for Evaluation run of jan-hq/stealth-v1.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jan-hq/stealth-v1.3](https://huggingface.co/jan-hq/stealth-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jan-hq__stealth-v1.3",
"harness_winogrande_5",
split="train")
```
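If you only need the aggregated scores rather than the per-example details, a minimal sketch is shown below (assuming the `datasets` library is installed; the `results` configuration and the `latest` split are the ones declared in this card's metadata):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_jan-hq__stealth-v1.3"

# Enumerate the available configurations (one per evaluated task, plus "results").
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Load the aggregated results; the "latest" split points to the most recent run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```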
## Latest results
These are the [latest results from run 2024-01-14T07:33:07.818995](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-v1.3/blob/main/results_2024-01-14T07-33-07.818995.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6295492975801164,
"acc_stderr": 0.032574778382655614,
"acc_norm": 0.631127878406581,
"acc_norm_stderr": 0.033231725904867095,
"mc1": 0.4173806609547124,
"mc1_stderr": 0.017262891063272178,
"mc2": 0.591209574646901,
"mc2_stderr": 0.015611059031702696
},
"harness|arc:challenge|25": {
"acc": 0.6160409556313993,
"acc_stderr": 0.01421244498065189,
"acc_norm": 0.6518771331058021,
"acc_norm_stderr": 0.013921008595179342
},
"harness|hellaswag|10": {
"acc": 0.6489743079067914,
"acc_stderr": 0.004763155068744876,
"acc_norm": 0.844353714399522,
"acc_norm_stderr": 0.003617787934747749
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509476,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233504,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794086,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794086
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876164,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876164
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3675977653631285,
"acc_stderr": 0.016125543823552947,
"acc_norm": 0.3675977653631285,
"acc_norm_stderr": 0.016125543823552947
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906504,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906504
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284066,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284066
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101004,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101004
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.01954210156485412,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.01954210156485412
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4173806609547124,
"mc1_stderr": 0.017262891063272178,
"mc2": 0.591209574646901,
"mc2_stderr": 0.015611059031702696
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090254
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.013428382481274252
}
}
```
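As an alternative to the `datasets` API, here is a minimal sketch for pulling the raw results file linked above and summarizing it (assuming the per-task scores are laid out as in the snippet; in the raw file they may be nested under a `results` key):
```python
import json
from huggingface_hub import hf_hub_download

# Download the results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jan-hq__stealth-v1.3",
    repo_type="dataset",
    filename="results_2024-01-14T07-33-07.818995.json",
)

with open(path) as f:
    data = json.load(f)

# The per-task dict may sit at the top level (as printed above) or under "results".
scores = data.get("results", data)

# Rough summary: mean 5-shot accuracy over all MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```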
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jan-hq__stealth-v1.3 | [
"region:us"
] | 2024-01-14T07:35:31+00:00 | {"pretty_name": "Evaluation run of jan-hq/stealth-v1.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [jan-hq/stealth-v1.3](https://huggingface.co/jan-hq/stealth-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jan-hq__stealth-v1.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T07:33:07.818995](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-v1.3/blob/main/results_2024-01-14T07-33-07.818995.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6295492975801164,\n \"acc_stderr\": 0.032574778382655614,\n \"acc_norm\": 0.631127878406581,\n \"acc_norm_stderr\": 0.033231725904867095,\n \"mc1\": 0.4173806609547124,\n \"mc1_stderr\": 0.017262891063272178,\n \"mc2\": 0.591209574646901,\n \"mc2_stderr\": 0.015611059031702696\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6160409556313993,\n \"acc_stderr\": 0.01421244498065189,\n \"acc_norm\": 0.6518771331058021,\n \"acc_norm_stderr\": 0.013921008595179342\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6489743079067914,\n \"acc_stderr\": 0.004763155068744876,\n \"acc_norm\": 0.844353714399522,\n \"acc_norm_stderr\": 0.003617787934747749\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 
0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 
0.02439667298509476,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509476\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233504,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233504\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794086,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794086\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.013890862162876164,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.013890862162876164\n 
},\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3675977653631285,\n \"acc_stderr\": 0.016125543823552947,\n \"acc_norm\": 0.3675977653631285,\n \"acc_norm_stderr\": 0.016125543823552947\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906504,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906504\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.025171041915309684,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.025171041915309684\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n \"acc_stderr\": 0.012753716929101004,\n \"acc_norm\": 0.4745762711864407,\n \"acc_norm_stderr\": 0.012753716929101004\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6290849673202614,\n \"acc_stderr\": 0.01954210156485412,\n \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.01954210156485412\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4173806609547124,\n \"mc1_stderr\": 0.017262891063272178,\n \"mc2\": 0.591209574646901,\n \"mc2_stderr\": 0.015611059031702696\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090254\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \"acc_stderr\": 0.013428382481274252\n }\n}\n```", "repo_url": "https://huggingface.co/jan-hq/stealth-v1.3", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-33-07.818995.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-33-07.818995.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-33-07.818995.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-33-07.818995.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-33-07.818995.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|winogrande|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["results_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T07-33-07.818995.parquet"]}]}]} | 2024-01-14T07:35:55+00:00 |
b46d603d5e61be6b279806bfe4fce69dea17bf07 |
# Dataset of kako/加古/加古 (Azur Lane)
This is the dataset of kako/加古/加古 (Azur Lane), containing 12 images and their tags.
The core tags of this character are `braid, brown_hair, long_hair, glasses, semi-rimless_eyewear, twin_braids, under-rim_eyewear, red-framed_eyewear, animal_ears, breasts, large_breasts, aqua_eyes, bangs, between_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 8.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kako_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kako_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 25 | 13.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kako_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 8.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kako_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 25 | 14.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kako_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kako_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | sailor_collar, 1girl, solo, pleated_skirt, crop_top, detached_sleeves, looking_at_viewer, neckerchief, retrofit_(azur_lane), black_skirt, midriff, closed_mouth, sleeveless_shirt, white_gloves, white_thighhighs, wide_sleeves, blush, miniskirt, navel, adjusting_eyewear, bare_shoulders, serafuku, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | sailor_collar | 1girl | solo | pleated_skirt | crop_top | detached_sleeves | looking_at_viewer | neckerchief | retrofit_(azur_lane) | black_skirt | midriff | closed_mouth | sleeveless_shirt | white_gloves | white_thighhighs | wide_sleeves | blush | miniskirt | navel | adjusting_eyewear | bare_shoulders | serafuku | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------|:--------|:-------|:----------------|:-----------|:-------------------|:--------------------|:--------------|:-----------------------|:--------------|:----------|:---------------|:-------------------|:---------------|:-------------------|:---------------|:--------|:------------|:--------|:--------------------|:-----------------|:-----------|:--------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/kako_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T07:35:56+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T07:38:49+00:00 |
33de20f20c698bd85c07c9ace9553c5a33ae7ff6 | mayank200456789/Viva | [
"license:mit",
"region:us"
] | 2024-01-14T07:44:35+00:00 | {"license": "mit"} | 2024-01-14T07:51:08+00:00 |
|
12b0953f0bedba426e52926723a387d4118155bf |
# A kernel function which improves the accuracy and interpretability of large ensembles of neural networks
We describe a new kernel (i.e. similarity function between pairs of examples) which is computed using an ensemble of neural networks. It has the following properties:
- Using it to predict test labels (via k-nearest neighbors across the training set) yields even higher accuracy than the standard ensemble inference method
of averaging predictions, once the number of networks exceeds about 100. We believe this kernel + k-NN method is the state-of-the-art for inferencing large ensembles
(although such ensembles are rarely used in practice).
- Being a similarity function, it is highly interpretable. For each test example, it allows us to visualize training examples which are deemed to have
similar features by the training process, with much greater fidelity than e.g. penultimate layer embeddings. For instance, we use this to identify the (known) fact that
~10% of the CIFAR-10 test-set examples have a near-duplicate in the training set, and to identify a failure mode.
To compute the kernel for an ensemble of n=500 models, we provide the following simple code (which can be copy-paste run in your environment).
```
import torch
import torchvision
import huggingface_hub
def normalize(logits):
logits = logits.float()
logits = logits.log_softmax(-1)
logits = (logits - logits.mean(0, keepdim=True)) / logits.std(0, keepdim=True)
return logits
def compute_kernel(logits1, logits2):
logits1 = normalize(logits1)
logits2 = normalize(logits2)
assert len(logits1) == len(logits2)
kernel = torch.zeros(logits1.shape[1], logits2.shape[1]).cuda()
for c in range(10):
logits1_cls = logits1[..., c].cuda()
logits2_cls = logits2[..., c].cuda()
corr_cls = (logits1_cls.T @ logits2_cls) / len(logits1)
kernel += corr_cls / 10
return kernel
######################################################################################
# Setup: Download CIFAR-10 labels and the outputs from 500 repeated training runs. #
######################################################################################
labels_train = torch.tensor(torchvision.datasets.CIFAR10('cifar10', train=True).targets)
labels_test = torch.tensor(torchvision.datasets.CIFAR10('cifar10', train=False).targets)
api = huggingface_hub.HfApi()
fname = 'logs_saveoutputs_main/06109e85-f5d7-4ac8-b0b0-f03542f23234/log.pt'
obj_path = api.hf_hub_download('kjj0/cifar10-multirun-logits', repo_type='dataset',
filename=fname)
obj = torch.load(obj_path, map_location='cpu')
# print(obj['code']) # Uncomment if you want to see the training code
######################################################################################
# Evaluate both the per-model and ensembled accuracy of the training outputs. #
######################################################################################
each_acc = (obj['logits'].argmax(-1) == labels_test).float().mean(1)
avg_acc = each_acc.mean()
print('average single-model accuracy \t: %.2f' % (100 * avg_acc))
ens_pred = obj['logits'].mean(0).argmax(1)
ens_acc = (ens_pred == labels_test).float().mean()
print('ensemble accuracy (%d models) \t: %.2f' % (len(obj['logits']), 100 * ens_acc))
# (n.b. averaging probabilities instead of logits makes no difference)
######################################################################################
# Evaluate the new kernel / ensemble inference method. #
######################################################################################
# use correlations between log_softmax outputs as a similarity metric for k-NN inference.
kernel = compute_kernel(obj['logits'], obj['logits_train'])
k = 3
nbrs = kernel.topk(k, dim=1)
nbr_labels = labels_train[nbrs.indices.cpu()]
pred = nbr_labels.mode(1).values
acc = (pred == labels_test).float().mean()
print('kernel accuracy (k-NN w/ k=%d) \t: %.2f' % (k, 100 * acc))
## average single-model accuracy : 93.26
## ensemble accuracy (500 models) : 94.69
## kernel accuracy (k-NN w/ k=3) : 95.01
```
The training configuration we used to generate these 500 models (i.e. the script that we re-ran 500 times with different random seeds) yields a mean accuracy of 93.26%.
If we average the predictions across those 500 models, we attain a much improved accuracy of 94.69%.
If we predict the test-set labels using our kernel applied to pairs of (train, test) examples, using k-nearest neighbors with k=3,
then we attain an even higher accuracy of 95.01%.
We include 20,000 total runs of training for the same training configuration that generated the 500 runs used in the above.
The outputs of those runs (i.e. the logits predicted by the final model on the training and test examples) can be found as the other files in `logs_saveoutputs_main`.
If we compute the kernel with all 20,000 runs instead of 500, and use a weighting scheme based on the correlation values,
then the accuracy can be futher increased to 95.53%.
Note that increasing from 500 to 20,000 does not improve the accuracy of the averaged predictions,
so with 95.53% we have reached 0.84% higher than the standard ensemble accuracy.
We additionally include outputs from three other training configurations; their kernels seem to have the same properties.
## Interpretability-type applications
### Finding similar pairs
(Below:) We rank the CIFAR-10 test-set examples by their similarity to their most similar training-set example.
We show the 601th-648th most highly ranked test examples (out of 10,000), along with their matched training examples.
Many of them turn out to be visually similar pairs.

We note that the penultimate-layer features almost entirely lack this property --
if we visualize the most similar pairs across all (test, train) pairs according to distance in penultimate feature space,
we will get not duplicates but instead just random highly confident examples which have all presumably collapsed to a similar point in space.
On the other hand, pairs which are given a high similarity score by our correlation kernel turn out to often be near-duplicates, and this holds true
for the most similar pairs even when we reduce the number of models in the ensemble down to a relatively small value like 10 or 20.
### Diagnosing failure modes
(Below:) We rank the CIFAR-10 test examples by how similar their most similar training-set example is, and then filter for cases where they have different labels.
The first (leftmost) column contains the top 8 such test examples, and then subsequent columns are their 9 nearest neighbors in the training set.
It appears that our network has difficulty seeing small objects.

### Some random examples
(Below:) We select 10 CIFAR-10 test examples at random (the first row), and display their two nearest neighbors according to the kernel (second two rows),
and the penultimate features from a single model (next two rows). The kernel yields images which are perceptually similar, whereas penultimate features
select nearly a random image of the same label.

## Open questions
* The usage of `log_softmax` in the normalization step seems to be important, especially for making the kernel work with n < 1,000 (where n is the number of networks).
But for n -> infty, it becomes less important. Why -- is it somehow removing noise?
* Via the Neural Network Gaussian Process (NNGP) theory, it is possible to compute the expectation of this kernel for untrained / newly initialized networks
(at least if the log-softmax is removed). Is there any general theory for what this kernel becomes after training (i.e., what we are seeing here)?
* This kernel is implemented as a sum of 10 correlation kernels -- one for each class. But upon inspection, each of those has dramatically worse
k-NN accuracy than their sum, at least until n becomes on the order of thousands. Why?
* Removing log-softmax, despite harming the overall accuracy as discussed earlier,
apparently increases the k-NN accuracy (and generally quality) of the individual kernels. Why??
* How does this kernel compare to [TRAK](https://arxiv.org/abs/2303.14186)
or the datamodel embeddings from [https://arxiv.org/abs/2202.00622](https://arxiv.org/abs/2202.00622)?
| kjj0/cifar10-multirun-logits | [
"license:mit",
"arxiv:2303.14186",
"arxiv:2202.00622",
"region:us"
] | 2024-01-14T07:46:15+00:00 | {"license": "mit"} | 2024-01-14T20:54:31+00:00 |
d80a8c712fe1b70935e1230231cdf3d0d7ce2d04 | **Context**
The dataset contains the Hindi and English subtitles for famous YouTube channels. This dataset was mainly created for the Hindi Language channel since the main goal was to use this dataset to build LLMs using the Hindi Language.
Data from channels in Information, Entertainment, Politics, Comedy, News, etc categories has been included in this dataset.
***Dataset Stats:***
- **58 channels**
- **103,042 total videos**
**Content**
- Video subtitles in Hindi and English
- Video metadata like duration, number of comments, likes, counts, published date
**Acknowledgements**
The source of this dataset is YouTube. The following packages were used to generate this dataset:
- [youtube-transcript-api](https://pypi.org/project/youtube-transcript-api/)
- [google-api-python-client](https://pypi.org/project/google-api-python-client/)
**Inspiration**
- Build LLMs model using Hindi
- Finetune models using Hindi for tasks like classification, summarization, translation, etc | pardeep/youtube-vidoes-transcripts-hindi-english | [
"license:odc-by",
"region:us"
] | 2024-01-14T07:47:23+00:00 | {"license": "odc-by"} | 2024-01-20T07:22:28+00:00 |
35134b0621b1c627dbe9fe3157092de25bbff48a |
# Dataset Card for Evaluation run of abideen/NexoNimbus-MoE-2x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abideen/NexoNimbus-MoE-2x7B](https://huggingface.co/abideen/NexoNimbus-MoE-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abideen__NexoNimbus-MoE-2x7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-14T07:48:44.681252](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__NexoNimbus-MoE-2x7B/blob/main/results_2024-01-14T07-48-44.681252.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6460879831062891,
"acc_stderr": 0.032126633738886544,
"acc_norm": 0.6490547896974873,
"acc_norm_stderr": 0.03277080381411241,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677157,
"mc2": 0.5305573947385855,
"mc2_stderr": 0.015330669260578354
},
"harness|arc:challenge|25": {
"acc": 0.6228668941979523,
"acc_stderr": 0.014163366896192598,
"acc_norm": 0.6680887372013652,
"acc_norm_stderr": 0.013760988200880536
},
"harness|hellaswag|10": {
"acc": 0.6683927504481179,
"acc_stderr": 0.004698285350019219,
"acc_norm": 0.856602270464051,
"acc_norm_stderr": 0.0034976171082184023
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055273,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055273
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276878,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276878
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887048,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887048
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461783,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461783
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903338,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903338
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546835,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106603,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106603
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781753,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162666,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162666
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677157,
"mc2": 0.5305573947385855,
"mc2_stderr": 0.015330669260578354
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.010905978112156888
},
"harness|gsm8k|5": {
"acc": 0.535253980288097,
"acc_stderr": 0.01373820799017732
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abideen__NexoNimbus-MoE-2x7B | [
"region:us"
] | 2024-01-14T07:50:59+00:00 | {"pretty_name": "Evaluation run of abideen/NexoNimbus-MoE-2x7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [abideen/NexoNimbus-MoE-2x7B](https://huggingface.co/abideen/NexoNimbus-MoE-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abideen__NexoNimbus-MoE-2x7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T07:48:44.681252](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__NexoNimbus-MoE-2x7B/blob/main/results_2024-01-14T07-48-44.681252.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6460879831062891,\n \"acc_stderr\": 0.032126633738886544,\n \"acc_norm\": 0.6490547896974873,\n \"acc_norm_stderr\": 0.03277080381411241,\n \"mc1\": 0.3598531211750306,\n \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5305573947385855,\n \"mc2_stderr\": 0.015330669260578354\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6228668941979523,\n \"acc_stderr\": 0.014163366896192598,\n \"acc_norm\": 0.6680887372013652,\n \"acc_norm_stderr\": 0.013760988200880536\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6683927504481179,\n \"acc_stderr\": 0.004698285350019219,\n \"acc_norm\": 0.856602270464051,\n \"acc_norm_stderr\": 0.0034976171082184023\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 
0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276878,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276878\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 
0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887048,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887048\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461783,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461783\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.033981108902946366,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.033981108902946366\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903338,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 
0.013586619219903338\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546835,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546835\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n \"acc_stderr\": 0.016204672385106603,\n \"acc_norm\": 0.376536312849162,\n \"acc_norm_stderr\": 0.016204672385106603\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162666,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162666\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5305573947385855,\n \"mc2_stderr\": 0.015330669260578354\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.010905978112156888\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.535253980288097,\n \"acc_stderr\": 0.01373820799017732\n }\n}\n```", "repo_url": "https://huggingface.co/abideen/NexoNimbus-MoE-2x7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-48-44.681252.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-48-44.681252.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-48-44.681252.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-48-44.681252.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-48-44.681252.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-48-44.681252.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["**/details_harness|winogrande|5_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T07-48-44.681252.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T07_48_44.681252", "path": ["results_2024-01-14T07-48-44.681252.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T07-48-44.681252.parquet"]}]}]} | 2024-01-14T07:51:19+00:00 |