| sha (string, length 40) | text (string, length 0–13.4M) | id (string, length 2–117) | tags (sequence) | created_at (string, length 25) | metadata (string, length 2–31.7M) | last_modified (string, length 25) |
---|---|---|---|---|---|---|
2f26c813ce505b067ebc395b9c67de15e1d18e8d |
# Dataset of hatakaze/旗風/旗风 (Azur Lane)
This is the dataset of hatakaze/旗風/旗风 (Azur Lane), containing 22 images and their tags.
The core tags of this character are `glasses, animal_ears, long_hair, yellow_eyes, round_eyewear, very_long_hair, twintails, braid, tail, breasts, fox_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 34.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatakaze_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 18.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatakaze_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 58 | 41.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatakaze_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 30.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatakaze_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 58 | 59.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatakaze_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/hatakaze_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
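If you then want the extracted items written back to disk as plain image + caption pairs (the IMG+TXT layout used by the `800`/`1200` packages above), waifuc's exporters can do that. This is a minimal sketch, assuming the `TextualInversionExporter` from `waifuc.export`; check your waifuc version's documentation for the exact exporter names.
```python
from waifuc.export import TextualInversionExporter
from waifuc.source import LocalSource

# re-load the extracted raw dataset and save each item as an image plus a .txt caption
source = LocalSource('dataset_dir')
source.export(TextualInversionExporter('dataset_img_txt'))
```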
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, solo, looking_at_viewer, wide_sleeves, black_kimono, leaf, long_sleeves, hakama_short_skirt, holding, pleated_skirt, sleeves_past_wrists |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | wide_sleeves | black_kimono | leaf | long_sleeves | hakama_short_skirt | holding | pleated_skirt | sleeves_past_wrists |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:---------------|:---------------|:-------|:---------------|:---------------------|:----------|:----------------|:----------------------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/hatakaze_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T07:52:04+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T07:57:54+00:00 |
bd14b91f1fd569fdd75488533afc9b4e52f4df3c |
# Dataset of wichita/ウィチタ/威奇塔 (Azur Lane)
This is the dataset of wichita/ウィチタ/威奇塔 (Azur Lane), containing 25 images and their tags.
The core tags of this character are `breasts, long_hair, red_hair, red_eyes, large_breasts, bangs, ponytail, hair_between_eyes, ahoge, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 32.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wichita_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 18.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wichita_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 59 | 37.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wichita_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 28.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wichita_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 59 | 51.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wichita_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/wichita_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, black_gloves, cleavage, white_shirt, belt, black_pants, looking_at_viewer, red_rose, black_choker, black_jacket, earrings, solo, collared_shirt, formal, holding_cup, navel, black_footwear, blue_jacket, cannon, crossed_arms, full_body, grin, high_heels, indoors, night, official_alternate_costume, open_clothes, partially_unbuttoned, standing, wine_glass |
| 1 | 11 |  |  |  |  |  | 1girl, cleavage, jacket_on_shoulders, navel, solo, thighhighs, white_gloves, collarbone, epaulettes, garter_straps, hair_ribbon, looking_at_viewer, midriff, simple_background, miniskirt, open_mouth, black_bra, holding, medium_breasts, military_uniform, stomach, very_long_hair, white_background, white_jacket, white_skirt, :d, armpits, blush, bow, bra_peek |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | cleavage | white_shirt | belt | black_pants | looking_at_viewer | red_rose | black_choker | black_jacket | earrings | solo | collared_shirt | formal | holding_cup | navel | black_footwear | blue_jacket | cannon | crossed_arms | full_body | grin | high_heels | indoors | night | official_alternate_costume | open_clothes | partially_unbuttoned | standing | wine_glass | jacket_on_shoulders | thighhighs | white_gloves | collarbone | epaulettes | garter_straps | hair_ribbon | midriff | simple_background | miniskirt | open_mouth | black_bra | holding | medium_breasts | military_uniform | stomach | very_long_hair | white_background | white_jacket | white_skirt | :d | armpits | blush | bow | bra_peek |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------|:--------------|:-------|:--------------|:--------------------|:-----------|:---------------|:---------------|:-----------|:-------|:-----------------|:---------|:--------------|:--------|:-----------------|:--------------|:---------|:---------------|:------------|:-------|:-------------|:----------|:--------|:-----------------------------|:---------------|:-----------------------|:-----------|:-------------|:----------------------|:-------------|:---------------|:-------------|:-------------|:----------------|:--------------|:----------|:--------------------|:------------|:-------------|:------------|:----------|:-----------------|:-------------------|:----------|:-----------------|:-------------------|:---------------|:--------------|:-----|:----------|:--------|:------|:-----------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | | X | | | | X | | | | | X | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/wichita_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T07:52:10+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T07:59:08+00:00 |
133ce9980621e096d5dc40d1f565c1d233bb383e | lvdthieu/solfile | [
"license:mit",
"region:us"
] | 2024-01-14T08:01:45+00:00 | {"license": "mit"} | 2024-01-14T08:17:27+00:00 |
|
354ef33e9d68dda66f126443fbee07aee6957944 |
# Dataset of l_indomptable/ランドンターブル/不屈 (Azur Lane)
This is the dataset of l_indomptable/ランドンターブル/不屈 (Azur Lane), containing 24 images and their tags.
The core tags of this character are `blue_eyes, long_hair, multicolored_hair, white_hair, breasts, hair_bun, very_long_hair, black_hair, double_bun, bangs, hair_between_eyes, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 50.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_indomptable_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 23.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_indomptable_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 60 | 51.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_indomptable_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 43.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_indomptable_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 60 | 82.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_indomptable_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/l_indomptable_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, open_mouth, simple_background, white_background, white_dress, white_pantyhose, gradient_hair, twintails |
| 1 | 6 |  |  |  |  |  | 1girl, navel, blush, looking_at_viewer, solo, nipples, pussy, barefoot, on_back, spread_legs, underwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | open_mouth | simple_background | white_background | white_dress | white_pantyhose | gradient_hair | twintails | navel | nipples | pussy | barefoot | on_back | spread_legs | underwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:-------------|:--------------------|:-------------------|:--------------|:------------------|:----------------|:------------|:--------|:----------|:--------|:-----------|:----------|:--------------|:------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | | | | | | | X | X | X | X | X | X | X |
| CyberHarem/l_indomptable_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T08:05:29+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T08:13:43+00:00 |
7be22db1f5205bf1c3f731e11eb6bc93c2a5da72 | UmairT/scholarships_dataset | [
"region:us"
] | 2024-01-14T08:25:15+00:00 | {} | 2024-01-14T08:26:42+00:00 |
|
ef83382d2a8eb8856c20efa772dba94059379916 | royal4545/mammootty | [
"region:us"
] | 2024-01-14T08:29:08+00:00 | {} | 2024-01-14T08:38:19+00:00 |
|
98d15ce0fe02ad779bd2a04467c9caaf89fc6b72 |
# Dataset of jersey/ジャージー/泽西 (Azur Lane)
This is the dataset of jersey/ジャージー/泽西 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `long_hair, red_hair, twintails, bangs, breasts, low_twintails, very_long_hair, yellow_eyes, ahoge, antenna_hair, bow, brown_eyes, crown, hair_ornament, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 21.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jersey_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 11.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jersey_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 36 | 27.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jersey_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 18.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jersey_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 36 | 39.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jersey_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/jersey_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, blush, solo, looking_at_viewer, open_mouth, detached_sleeves, bare_shoulders, dress, sleeveless |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | solo | looking_at_viewer | open_mouth | detached_sleeves | bare_shoulders | dress | sleeveless |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:-------------|:-------------------|:-----------------|:--------|:-------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
| CyberHarem/jersey_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T08:32:24+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T08:37:25+00:00 |
fc5bb41d385306a5931727202251189e5ebb25a2 |
# Dataset of u_96/U-96 (Azur Lane)
This is the dataset of u_96/U-96 (Azur Lane), containing 14 images and their tags.
The core tags of this character are `breasts, yellow_eyes, bangs, small_breasts, twintails, goggles_on_head, white_hair, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 14 | 18.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_96_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 14 | 10.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_96_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 32 | 24.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_96_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 14 | 15.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_96_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 32 | 31.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_96_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/u_96_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, solo, looking_at_viewer, goggles, open_mouth, smile, open_jacket, white_one-piece_swimsuit, blue_jacket, yellow_gloves, bubble, holding, hood_down |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | goggles | open_mouth | smile | open_jacket | white_one-piece_swimsuit | blue_jacket | yellow_gloves | bubble | holding | hood_down |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:----------|:-------------|:--------|:--------------|:---------------------------|:--------------|:----------------|:---------|:----------|:------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/u_96_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T08:32:43+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T08:36:38+00:00 |
27c4b579fe83512fad94577b5297bb5b39ce0b3c |
# Dataset of sao_martinho/サン・マルチーニョ/圣马丁号 (Azur Lane)
This is the dataset of sao_martinho/サン・マルチーニョ/圣马丁号 (Azur Lane), containing 14 images and their tags.
The core tags of this character are `breasts, large_breasts, long_hair, red_eyes, white_hair, very_long_hair, bangs, dark-skinned_female, dark_skin, wings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 14 | 30.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sao_martinho_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 14 | 14.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sao_martinho_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 35 | 30.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sao_martinho_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 14 | 25.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sao_martinho_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 35 | 47.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sao_martinho_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/sao_martinho_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, center_opening, blush, navel, white_dress, cleavage, smile, breast_curtains |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | bare_shoulders | center_opening | blush | navel | white_dress | cleavage | smile | breast_curtains |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------------|:-----------------|:--------|:--------|:--------------|:-----------|:--------|:------------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/sao_martinho_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T08:32:58+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T08:36:44+00:00 |
bb364c6e952b9e72e2742480559b65b45c10d671 | xzuyn/example-axolotl-completion | [
"task_categories:text-generation",
"size_categories:n<1K",
"language:en",
"region:us"
] | 2024-01-14T08:45:00+00:00 | {"language": ["en"], "size_categories": ["n<1K"], "task_categories": ["text-generation"], "pretty_name": "Example Axolotl Completion"} | 2024-01-14T09:03:53+00:00 |
|
3cea288bbdadeb997a673ad0ce40e0a7e94d038b | shermin/guanaco-llama2-1k | [
"region:us"
] | 2024-01-14T08:47:35+00:00 | {} | 2024-01-14T08:47:35+00:00 |
|
c8613764eb1916b488c85672a645cf50c01e7f90 |
# Dataset of kersaint/ケルサン/凯尔圣 (Azur Lane)
This is the dataset of kersaint/ケルサン/凯尔圣 (Azur Lane), containing 30 images and their tags.
The core tags of this character are `breasts, long_hair, red_eyes, bangs, blonde_hair, large_breasts, very_long_hair, twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 66.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kersaint_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 29.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kersaint_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 80 | 64.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kersaint_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 54.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kersaint_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 80 | 106.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kersaint_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kersaint_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | cleavage, looking_at_viewer, 1girl, pantyhose, solo, veil, white_dress, long_sleeves, simple_background, white_background, blush, closed_mouth, covered_navel, hair_between_eyes, pelvic_curtain, clothing_cutout, black_footwear, bodystocking, detached_sleeves, high_heels, thigh_strap |
| 1 | 10 |  |  |  |  |  | 1girl, cleavage, blush, midriff, navel, visor_cap, black_jacket, solo, sweat, looking_at_viewer, open_jacket, white_sports_bra, tight_pants, water_bottle, armpits, collarbone, holding_bottle, open_mouth, sunglasses, white_pants, bare_shoulders, black_footwear, long_sleeves, parted_lips, sneakers, stomach, yoga_pants |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | cleavage | looking_at_viewer | 1girl | pantyhose | solo | veil | white_dress | long_sleeves | simple_background | white_background | blush | closed_mouth | covered_navel | hair_between_eyes | pelvic_curtain | clothing_cutout | black_footwear | bodystocking | detached_sleeves | high_heels | thigh_strap | midriff | navel | visor_cap | black_jacket | sweat | open_jacket | white_sports_bra | tight_pants | water_bottle | armpits | collarbone | holding_bottle | open_mouth | sunglasses | white_pants | bare_shoulders | parted_lips | sneakers | stomach | yoga_pants |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------|:--------------------|:--------|:------------|:-------|:-------|:--------------|:---------------|:--------------------|:-------------------|:--------|:---------------|:----------------|:--------------------|:-----------------|:------------------|:-----------------|:---------------|:-------------------|:-------------|:--------------|:----------|:--------|:------------|:---------------|:--------|:--------------|:-------------------|:--------------|:---------------|:----------|:-------------|:-----------------|:-------------|:-------------|:--------------|:-----------------|:--------------|:-----------|:----------|:-------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | | X | | | X | | | X | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/kersaint_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T08:50:25+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T08:59:06+00:00 |
920515c73ef9730cf0f10d41eed4dab0bc52ade2 |
# Dataset of abukuma/阿武隈/阿武隈 (Azur Lane)
This is the dataset of abukuma/阿武隈/阿武隈 (Azur Lane), containing 16 images and their tags.
The core tags of this character are `bangs, black_hair, hair_ornament, horns, red_eyes, ahoge, hairclip, pointy_ears, breasts, earrings, facial_mark, fang, hair_between_eyes, short_hair, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 14.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/abukuma_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 12.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/abukuma_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 39 | 24.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/abukuma_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 13.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/abukuma_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 39 | 27.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/abukuma_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/abukuma_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, solo, bare_shoulders, jewelry, looking_at_viewer, sideboob, smile, thighhighs, bell, closed_mouth, cloud, detached_sleeves, medium_breasts, open_mouth, outdoors, shirt, shorts, sky |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | jewelry | looking_at_viewer | sideboob | smile | thighhighs | bell | closed_mouth | cloud | detached_sleeves | medium_breasts | open_mouth | outdoors | shirt | shorts | sky |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:----------|:--------------------|:-----------|:--------|:-------------|:-------|:---------------|:--------|:-------------------|:-----------------|:-------------|:-----------|:--------|:---------|:------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/abukuma_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T08:50:31+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T08:55:09+00:00 |
35c2a32cb63842cb68ecf7e81cdfc561abf3cc31 | lithium0003/findtextCenterNet_dataset | [
"license:mit",
"region:us"
] | 2024-01-14T08:58:51+00:00 | {"license": "mit"} | 2024-01-17T15:55:21+00:00 |
|
2fa5d26bd06febe33738c11a868845e23a76c84b |
# Dataset of hardy/ハーディ/勇敢 (Azur Lane)
This is the dataset of hardy/ハーディ/勇敢 (Azur Lane), containing 22 images and their tags.
The core tags of this character are `blonde_hair, hat, blue_eyes, short_hair, ribbon, bangs, braid, hair_bun`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 27.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hardy_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 16.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hardy_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 54 | 35.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hardy_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 24.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hardy_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 54 | 48.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hardy_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/hardy_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | white_gloves, 1girl, cape, solo, rapier, epaulettes, looking_at_viewer, holding_sword, simple_background, boots, uniform, white_background, full_body, open_mouth, pleated_skirt, saber_(weapon), smile, torpedo_tubes, white_skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | white_gloves | 1girl | cape | solo | rapier | epaulettes | looking_at_viewer | holding_sword | simple_background | boots | uniform | white_background | full_body | open_mouth | pleated_skirt | saber_(weapon) | smile | torpedo_tubes | white_skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------|:--------|:-------|:-------|:---------|:-------------|:--------------------|:----------------|:--------------------|:--------|:----------|:-------------------|:------------|:-------------|:----------------|:-----------------|:--------|:----------------|:--------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/hardy_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T09:09:59+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T09:15:16+00:00 |
335bf8652c986eee9544305c41a19718194c4530 |
# Dataset of archerfish/アーチャーフィッシュ/射水鱼 (Azur Lane)
This is the dataset of archerfish/アーチャーフィッシュ/射水鱼 (Azur Lane), containing 14 images and their tags.
The core tags of this character are `breasts, hair_ornament, long_hair, purple_eyes, blonde_hair, star_hair_ornament, bangs, tail, one_side_up, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 14 | 17.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/archerfish_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 14 | 10.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/archerfish_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 35 | 20.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/archerfish_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 14 | 15.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/archerfish_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 35 | 27.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/archerfish_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/archerfish_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | looking_at_viewer, smile, 1girl, blush, star_(symbol), solo, sitting, skirt, thigh_strap, tanlines, ass, cleavage, looking_back, open_mouth, simple_background, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | smile | 1girl | blush | star_(symbol) | solo | sitting | skirt | thigh_strap | tanlines | ass | cleavage | looking_back | open_mouth | simple_background | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:--------|:--------|:----------------|:-------|:----------|:--------|:--------------|:-----------|:------|:-----------|:---------------|:-------------|:--------------------|:-------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/archerfish_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T09:10:01+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T09:14:35+00:00 |
5a2ef97dcafcd2b4a2399bf4d947e67daa2e3c7a |
# Dataset of curacoa/キュラソー/库拉索 (Azur Lane)
This is the dataset of curacoa/キュラソー/库拉索 (Azur Lane), containing 17 images and their tags.
The core tags of this character are `long_hair, breasts, maid_headdress, bangs, blue_eyes, large_breasts, brown_hair, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 28.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/curacoa_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 15.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/curacoa_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 39 | 31.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/curacoa_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 24.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/curacoa_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 39 | 45.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/curacoa_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/curacoa_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | looking_at_viewer, smile, 1girl, dress, solo, frills, blush, juliet_sleeves, red_necktie, maid_apron, simple_background, white_apron, open_mouth, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | smile | 1girl | dress | solo | frills | blush | juliet_sleeves | red_necktie | maid_apron | simple_background | white_apron | open_mouth | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:--------|:--------|:-------|:---------|:--------|:-----------------|:--------------|:-------------|:--------------------|:--------------|:-------------|:-------------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/curacoa_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T09:10:23+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T09:16:54+00:00 |
be81b11ad67201a062a09ebdc4d336e3d397ba00 | akbarali506/legal-data | [
"license:unknown",
"region:us"
] | 2024-01-14T09:15:52+00:00 | {"license": "unknown"} | 2024-01-14T09:22:02+00:00 |
|
32ba79ccef5ffbd4eac5c7302077a62f5e030638 |
# DPO Pairs
This is a preprocessed version of [mlabonne/chatml_dpo_pairs](https://huggingface.co/datasets/mlabonne/chatml_dpo_pairs) using [Bunkatopics](https://github.com/charlesdedampierre/BunkaTopics) to extract meaningful topics that help models converge with less data.
The objective was to create a smaller dataset than the original while keeping its efficiency. To achieve this, we compared the two datasets used to train the reward model in [mlabonne/chatml_dpo_pairs](https://huggingface.co/datasets/mlabonne/chatml_dpo_pairs): the rejected Llama answers and the accepted ChatGPT answers from the DPO dataset.
We then conducted topic modeling on both datasets, keeping only the topics that existed in the accepted dataset but not in the rejected one. Our hypothesis is that these topics encapsulate the main differences between the two answering styles.
This method allows for quicker convergence with significantly less data (around 1/6 of the initial dataset).
See the page of the resulting test model [here](https://huggingface.co/charlesdedampierre/TopicNeuralHermes-2.5-Mistral-7B).
# Topic Analysis
We applied the topic modeling method to both datasets, extracting 30 topics from each. These topics were characterized using the 10 most specific unigrams or bigrams. We then compared the two sets of topics (30 from each dataset) and retained those in the accepted dataset that shared fewer than 2 terms with any topic in the rejected dataset.
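A minimal sketch of this term-overlap filter is shown below; the Bunkatopics extraction itself is not reproduced here, and the topic names and terms in the usage example are illustrative only.

```python
from typing import Dict, List

def distinctive_topics(accepted: Dict[str, List[str]],
                       rejected: Dict[str, List[str]],
                       max_shared: int = 2) -> Dict[str, List[str]]:
    """Keep accepted topics that share fewer than `max_shared` terms with every rejected topic."""
    kept = {}
    for name, terms in accepted.items():
        term_set = set(terms)
        if all(len(term_set & set(r_terms)) < max_shared for r_terms in rejected.values()):
            kept[name] = terms
    return kept

# Illustrative usage with toy topics (not the real extracted ones):
accepted = {'Film Critique and Analysis': ['movie review', 'film', 'critic', 'plot', 'DVD']}
rejected = {'Cinema Small Talk': ['movie review', 'actor', 'celebrity', 'gossip', 'trailer']}
print(distinctive_topics(accepted, rejected))  # only one shared term, so the topic is kept
```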
We found the following 13 distinctive topics, each described by 10 terms:
**Emotional Dynamics**: feelings, Quinn, Austin, minority women, teaching, schools, individual, personality, backgrounds, triggers.
**Global Knowledge Queries**: question, information, geography, news articles, Step, answer, capital city, pipeline system, country, analogy.
**Digital Interactions and Queries**: questions, question, PersonX, modem, answers, effect relationship, Quora, browser, answer, e-commerce.
**Business and Cybersecurity**: email, businesses, initiatives, innovation, advertising papers, spam, breaches, antivirus, payments, prospects.
**Lifestyle and Wellness**: sleep, exercise, gifts, shopping, Casey, stores, stress, headaches, options, mood.
**Wildlife Ecology**: birds, prey, animals, species, infection, nest, eggs, bacteria, insects, kitty condo.
**Environmental Science and Climate**: temperature, gases, greenhouse, emissions, perturbation, sulfur, dioxide, climate change, water, heat.
**Maritime and Mechanical Engineering**: ship, bowling, propulsion, beam width, Filing cabinet, LED, lane, containment area, lawnmower, rotors.
**Cultural and Social Dynamics**: Lindsey, museum, Kate, Rachel, Jason, Alex, Erin, conversation, Laura, exhibits.
**Political Media Analysis**: media platforms, election, politics, teenagers, elections, White House, Barack Obama, nation, Confederate, depression.
**International Relations and Policy**: cooperation, EU, nations, alliance, NATO, European Union, member states, policy, monarch, Brexit.
**Astrophysics and Physical Sciences**: electrons, km, Moon, acceleration, orbit, friction, current, asteroid, electron, collector emitter.
**Film Critique and Analysis**: movie review, film, reviewer, sentiment, critic, flaws, DVD, plot, opinion, originality.
While these topics are not domain-specific, they did not readily appear in the rejected dataset. Further research is needed to understand the reason behind the prominence of these topics in the accepted dataset.
# Load Dataset
```python
from datasets import load_dataset

dataset = load_dataset("bunkalab/topic_based_chatml_dpo_pairs")['train']
``` | bunkalab/topic_based_chatml_dpo_pairs | [
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-14T09:27:59+00:00 | {"language": ["en"], "license": "apache-2.0"} | 2024-01-14T15:26:46+00:00 |
ef5c509c3a7e50f79890c989710dcb603a6c4bbc |
# Dataset of exeter/エクセター/埃克塞特 (Azur Lane)
This is the dataset of exeter/エクセター/埃克塞特 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `breasts, green_eyes, brown_hair, long_hair, large_breasts, bangs, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 17.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/exeter_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 11.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/exeter_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 25 | 19.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/exeter_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 16.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/exeter_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 25 | 25.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/exeter_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/exeter_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
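For the packaged `IMG+TXT` variants listed above (e.g. `dataset-800.zip`), a plain download-and-unzip is enough. The sketch below assumes each image in the archive is paired with a same-named `.txt` file holding its tags, which is what the `IMG+TXT` type suggests but is not spelled out on this card.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download one of the packaged variants instead of the raw archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/exeter_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with its tag file, if one exists
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in {'.png', '.jpg', '.jpeg', '.webp'}:
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```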
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, cleavage, closed_mouth, standing, white_thighhighs, black_gloves, turret, chain, full_body, hat, single_thighhigh, uneven_legwear, white_background, white_dress, aiguillette, anchor, belt, blush, elbow_gloves, hair_between_eyes, hand_on_hip, machinery, medium_breasts, red_cape, rigging, short_dress, side_slit, simple_background |
| 1 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_dress, cleavage, flower, smile, long_sleeves, high_heels, holding_cup, indoors, jewelry, see-through, simple_background, wine_glass |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | cleavage | closed_mouth | standing | white_thighhighs | black_gloves | turret | chain | full_body | hat | single_thighhigh | uneven_legwear | white_background | white_dress | aiguillette | anchor | belt | blush | elbow_gloves | hair_between_eyes | hand_on_hip | machinery | medium_breasts | red_cape | rigging | short_dress | side_slit | simple_background | black_dress | flower | long_sleeves | high_heels | holding_cup | indoors | jewelry | see-through | wine_glass |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:-----------|:---------------|:-----------|:-------------------|:---------------|:---------|:--------|:------------|:------|:-------------------|:-----------------|:-------------------|:--------------|:--------------|:---------|:-------|:--------|:---------------|:--------------------|:--------------|:------------|:-----------------|:-----------|:----------|:--------------|:------------|:--------------------|:--------------|:---------|:---------------|:-------------|:--------------|:----------|:----------|:--------------|:-------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/exeter_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T09:28:57+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T09:32:29+00:00 |
13dbeaafd73668d52fc3f3de35418dc4eefa9797 |
# Dataset of nachi/那智/那智 (Azur Lane)
This is the dataset of nachi/那智/那智 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `bangs, breasts, hat, long_hair, large_breasts, very_long_hair, black_headwear, animal_ears, bow, pink_eyes, peaked_cap, brown_hair, floating_hair, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 20.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 10.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 21 | 19.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 17.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 21 | 30.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nachi_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, white_shirt, black_thighhighs, blue_skirt, cleavage, pink_bra, closed_mouth, collared_shirt, heart, miniskirt, nail_polish, navel, one_eye_closed, pleated_skirt, midriff, stomach, striped, white_background, blush, bra_peek, collarbone, crop_top, dress_shirt, full_body, garter_straps, simple_background, sleeves_rolled_up, tongue_out, black_footwear, breast_pocket, crossed_legs, hand_on_hip, hand_on_own_thigh, high_heels, jewelry, partially_unbuttoned, school_uniform, sitting, standing, zettai_ryouiki |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | white_shirt | black_thighhighs | blue_skirt | cleavage | pink_bra | closed_mouth | collared_shirt | heart | miniskirt | nail_polish | navel | one_eye_closed | pleated_skirt | midriff | stomach | striped | white_background | blush | bra_peek | collarbone | crop_top | dress_shirt | full_body | garter_straps | simple_background | sleeves_rolled_up | tongue_out | black_footwear | breast_pocket | crossed_legs | hand_on_hip | hand_on_own_thigh | high_heels | jewelry | partially_unbuttoned | school_uniform | sitting | standing | zettai_ryouiki |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:--------------|:-------------------|:-------------|:-----------|:-----------|:---------------|:-----------------|:--------|:------------|:--------------|:--------|:-----------------|:----------------|:----------|:----------|:----------|:-------------------|:--------|:-----------|:-------------|:-----------|:--------------|:------------|:----------------|:--------------------|:--------------------|:-------------|:-----------------|:----------------|:---------------|:--------------|:--------------------|:-------------|:----------|:-----------------------|:-----------------|:----------|:-----------|:-----------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/nachi_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T09:37:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T09:40:30+00:00 |
05ab5e8a21c51c2ae9162b25ac2e1f2827743c6e |
# Dataset of ardent/アーデント/热心 (Azur Lane)
This is the dataset of ardent/アーデント/热心 (Azur Lane), containing 16 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, long_hair, twintails, very_long_hair, breasts, ribbon, bangs, hair_ribbon, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 17.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ardent_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 11.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ardent_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 34 | 19.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ardent_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 15.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ardent_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 34 | 26.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ardent_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ardent_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, open_mouth, full_body, long_sleeves, necktie, pleated_skirt, shoes, smile, white_background, white_shirt, school_uniform, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | blush | open_mouth | full_body | long_sleeves | necktie | pleated_skirt | shoes | smile | white_background | white_shirt | school_uniform | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:-------------|:------------|:---------------|:----------|:----------------|:--------|:--------|:-------------------|:--------------|:-----------------|:-------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ardent_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T09:37:31+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T09:42:19+00:00 |
03bcce50be6b96e75316c2053457c6d1dea63582 | Hiraishin/ujianjpj-tanda-isyarat | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T09:37:35+00:00 | {"license": "apache-2.0"} | 2024-01-14T09:39:53+00:00 |
|
d56d3d9335eca9ff89e413ef3423b9d4a417b2ea | UnderstandLing/oasst1_bn_threads | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T09:41:54+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17371937, "num_examples": 9611}, {"name": "validation", "num_bytes": 854944, "num_examples": 455}], "download_size": 4942448, "dataset_size": 18226881}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T09:45:27+00:00 |
|
2edd044faa9a82bfd105b15c92955257fe9f9273 |
# Dataset of jupiter/ジュピター/丘比特 (Azur Lane)
This is the dataset of jupiter/ジュピター/丘比特 (Azur Lane), containing 15 images and their tags.
The core tags of this character are `long_hair, blue_eyes, hair_ornament, crown, purple_hair, bangs, ahoge, bow, two_side_up, very_long_hair, black_bow, blue_hair, mini_crown, parted_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 17.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jupiter_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 10.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jupiter_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 34 | 22.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jupiter_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 15.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jupiter_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 34 | 31.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jupiter_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jupiter_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_dress, bare_shoulders, blush, sleeveless_dress, white_thighhighs, antenna_hair, simple_background, torpedo, white_background |
| 1 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, twintails, animal_hood, blush, sleeves_past_wrists, torpedo, full_body, hair_bow, holding, hood_up, hooded_jacket, long_sleeves, parted_lips, star_(symbol), stuffed_animal, white_bloomers, barefoot, collarbone, hair_through_headwear, hairclip, leg_warmers, loose_socks, navel, object_hug, open_jacket, rabbit_hair_ornament, ribbon, sitting, slippers, soles, yellow_jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | white_dress | bare_shoulders | blush | sleeveless_dress | white_thighhighs | antenna_hair | simple_background | torpedo | white_background | twintails | animal_hood | sleeves_past_wrists | full_body | hair_bow | holding | hood_up | hooded_jacket | long_sleeves | parted_lips | star_(symbol) | stuffed_animal | white_bloomers | barefoot | collarbone | hair_through_headwear | hairclip | leg_warmers | loose_socks | navel | object_hug | open_jacket | rabbit_hair_ornament | ribbon | sitting | slippers | soles | yellow_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------|:-----------------|:--------|:-------------------|:-------------------|:---------------|:--------------------|:----------|:-------------------|:------------|:--------------|:----------------------|:------------|:-----------|:----------|:----------|:----------------|:---------------|:--------------|:----------------|:-----------------|:-----------------|:-----------|:-------------|:------------------------|:-----------|:--------------|:--------------|:--------|:-------------|:--------------|:-----------------------|:---------|:----------|:-----------|:--------|:----------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/jupiter_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T09:56:05+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T09:59:39+00:00 |
8b59cca2767fb4d0e22cb4db0a2be8c2a6f84aa0 |
# Dataset of bailey/ベイリー/贝利 (Azur Lane)
This is the dataset of bailey/ベイリー/贝利 (Azur Lane), containing 24 images and their tags.
The core tags of this character are `long_hair, red_hair, hair_ornament, red_eyes, side_ponytail, antenna_hair, rabbit_ears, bangs, bow, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 21.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bailey_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 16.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bailey_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 52 | 30.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bailey_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 19.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bailey_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 52 | 36.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bailey_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/bailey_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, hat, skirt, blush, food-themed_hair_ornament, open_mouth, stuffed_animal, stuffed_bunny, white_thighhighs, wrist_cuffs, animal_ears, one_eye_closed |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | hat | skirt | blush | food-themed_hair_ornament | open_mouth | stuffed_animal | stuffed_bunny | white_thighhighs | wrist_cuffs | animal_ears | one_eye_closed |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:------|:--------|:--------|:----------------------------|:-------------|:-----------------|:----------------|:-------------------|:--------------|:--------------|:-----------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/bailey_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T09:56:06+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T10:01:32+00:00 |
7eacd951a98e2257c386b7ee67a7e3fd04257fc0 |
# Dataset of l_opiniatre/ルピニャート/倔强 (Azur Lane)
This is the dataset of l_opiniatre/ルピニャート/倔强 (Azur Lane), containing 38 images and their tags.
The core tags of this character are `long_hair, green_eyes, breasts, purple_hair, ahoge, glasses, bangs, very_long_hair, ribbon, bow, small_breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 38 | 48.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_opiniatre_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 38 | 28.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_opiniatre_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 74 | 56.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_opiniatre_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 38 | 42.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_opiniatre_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 74 | 82.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_opiniatre_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/l_opiniatre_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | hair_ribbon, looking_at_viewer, red_ribbon, 1girl, cleavage, blush, purple_gloves, solo, blue_gloves, capelet, cross, semi-rimless_eyewear, white_thighhighs, black_choker |
| 1 | 9 |  |  |  |  |  | bare_shoulders, blush, looking_at_viewer, 1girl, hair_bow, hair_ornament, solo, bridal_garter, choker, frilled_bikini, navel, purple_bikini, red_bow, strapless_bikini, ass, bandeau, nail_polish, smile, stomach, thighs, blue_bikini, cross, earrings, eyewear_removed, groin, side_ponytail |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | hair_ribbon | looking_at_viewer | red_ribbon | 1girl | cleavage | blush | purple_gloves | solo | blue_gloves | capelet | cross | semi-rimless_eyewear | white_thighhighs | black_choker | bare_shoulders | hair_bow | hair_ornament | bridal_garter | choker | frilled_bikini | navel | purple_bikini | red_bow | strapless_bikini | ass | bandeau | nail_polish | smile | stomach | thighs | blue_bikini | earrings | eyewear_removed | groin | side_ponytail |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------|:--------------------|:-------------|:--------|:-----------|:--------|:----------------|:-------|:--------------|:----------|:--------|:-----------------------|:-------------------|:---------------|:-----------------|:-----------|:----------------|:----------------|:---------|:-----------------|:--------|:----------------|:----------|:-------------------|:------|:----------|:--------------|:--------|:----------|:---------|:--------------|:-----------|:------------------|:--------|:----------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | | X | | X | | X | | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/l_opiniatre_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T09:56:14+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T10:05:34+00:00 |
bde353ba321e90f2176e2434e17d083380fcdaac | # Dataset Card for "araproje_mmlu_en_conf_mgpt_nearestscore_true_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_en_conf_mgpt_nearestscore_true_y | [
"region:us"
] | 2024-01-14T10:02:06+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 130579.0, "num_examples": 250}], "download_size": 79213, "dataset_size": 130579.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:02:09+00:00 |
c86c43546bf1590e95d2cc0593152fc55d78e824 | # Dataset Card for "araproje_mmlu_en_conf_mgpt_nearestscore_true_x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_en_conf_mgpt_nearestscore_true_x | [
"region:us"
] | 2024-01-14T10:02:10+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 130579.0, "num_examples": 250}], "download_size": 79223, "dataset_size": 130579.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:02:12+00:00 |
c0c5cd40cc3b514615ab73260f9660a4544c9f9d | # Dataset Card for "araproje_mmlu_en_conf_mgpt_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_en_conf_mgpt_nearestscore_true | [
"region:us"
] | 2024-01-14T10:02:13+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 130579.0, "num_examples": 250}], "download_size": 79132, "dataset_size": 130579.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:02:15+00:00 |
d07091fcadcaf1f5d214f849e5f1bac0aa6611ea | # Dataset Card for "araproje_mmlu_tr_conf_gpt2_nearestscore_true_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_tr_conf_gpt2_nearestscore_true_y | [
"region:us"
] | 2024-01-14T10:04:19+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 137404.0, "num_examples": 250}], "download_size": 83939, "dataset_size": 137404.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:04:23+00:00 |
ae484fd785c2b9e5a1dbd1a6bd85cda47f60f153 | # Dataset Card for "araproje_mmlu_tr_conf_gpt2_nearestscore_true_x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_tr_conf_gpt2_nearestscore_true_x | [
"region:us"
] | 2024-01-14T10:04:25+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 137404.0, "num_examples": 250}], "download_size": 83805, "dataset_size": 137404.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:04:26+00:00 |
936a90253f0615b3e8b0d23e05c4af3f0ecdb260 | # Dataset Card for "araproje_mmlu_tr_conf_gpt2_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_tr_conf_gpt2_nearestscore_true | [
"region:us"
] | 2024-01-14T10:04:28+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 137404.0, "num_examples": 250}], "download_size": 83939, "dataset_size": 137404.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:04:29+00:00 |
22054c099646b942710e7ff552b79119857db6e6 | # Dataset Card for "araproje_hellaswag_tr_conf_gpt2_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_hellaswag_tr_conf_gpt2_nearestscore_true | [
"region:us"
] | 2024-01-14T10:06:58+00:00 | {"dataset_info": {"features": [{"name": "ind", "dtype": "int32"}, {"name": "activity_label", "dtype": "string"}, {"name": "ctx_a", "dtype": "string"}, {"name": "ctx_b", "dtype": "string"}, {"name": "ctx", "dtype": "string"}, {"name": "endings", "sequence": "string"}, {"name": "source_id", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "split_type", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 162703.0, "num_examples": 250}], "download_size": 87144, "dataset_size": 162703.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:07:00+00:00 |
0b5117d3891aba6465bed03cd5f8bdfe0290ffc7 | # Dataset Card for "araproje_arc_en_conf_mgpt_nearestscore_true_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_arc_en_conf_mgpt_nearestscore_true_y | [
"region:us"
] | 2024-01-14T10:09:48+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "sequence": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 80031.0, "num_examples": 250}], "download_size": 46799, "dataset_size": 80031.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:09:50+00:00 |
515f8b42f01c57f7180ab5e6b3f7a986eaa065ad | # Dataset Card for "araproje_arc_en_conf_mgpt_nearestscore_true_x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_arc_en_conf_mgpt_nearestscore_true_x | [
"region:us"
] | 2024-01-14T10:09:53+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "sequence": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 80031.0, "num_examples": 250}], "download_size": 46916, "dataset_size": 80031.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:09:59+00:00 |
7ac3917b84a00ad9131b5a71e240e878a1f09a8b | # Dataset Card for "araproje_arc_en_conf_mgpt_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_arc_en_conf_mgpt_nearestscore_true | [
"region:us"
] | 2024-01-14T10:10:02+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "sequence": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 80031.0, "num_examples": 250}], "download_size": 46813, "dataset_size": 80031.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:10:05+00:00 |
2f0bb505a81c75c8e0b32f0d3c989863b85e50e1 | # Dataset Card for "araproje_arc_tr_conf_gpt2_nearestscore_true_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_arc_tr_conf_gpt2_nearestscore_true_y | [
"region:us"
] | 2024-01-14T10:11:06+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "sequence": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 86423.0, "num_examples": 250}], "download_size": 50655, "dataset_size": 86423.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:11:08+00:00 |
6061861e39e93e77807668fe4f14c128788e1638 | # Dataset Card for "araproje_arc_tr_conf_gpt2_nearestscore_true_x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_arc_tr_conf_gpt2_nearestscore_true_x | [
"region:us"
] | 2024-01-14T10:11:11+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "sequence": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 86423.0, "num_examples": 250}], "download_size": 50724, "dataset_size": 86423.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:11:13+00:00 |
e9de7d62f6428047dd7d2f1b7a03e8f308bb457b | # Dataset Card for "araproje_arc_tr_conf_gpt2_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_arc_tr_conf_gpt2_nearestscore_true | [
"region:us"
] | 2024-01-14T10:11:16+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "sequence": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 86423.0, "num_examples": 250}], "download_size": 50681, "dataset_size": 86423.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:11:17+00:00 |
8104069faa6579b509248e184d53e6aada977914 |
# Dataset of le_mars/ル・マルス/勒马尔 (Azur Lane)
This is the dataset of le_mars/ル・マルス/勒马尔 (Azur Lane), containing 24 images and their tags.
The core tags of this character are `brown_hair, short_hair, blue_eyes, bow, breasts, ahoge, green_eyes, hair_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 30.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_mars_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 18.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_mars_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 48 | 33.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_mars_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 26.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_mars_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 48 | 48.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_mars_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/le_mars_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | looking_at_viewer, 1girl, solo, black_gloves, fingerless_gloves, shorts, smile, white_background, bare_shoulders, blush, cannon, full_body, hair_ornament, holding_weapon, machinery, navel, rigging, thighhighs, turret, bangs, closed_mouth, dark-skinned_female, simple_background, standing, sword |
| 1 | 6 |  |  |  |  |  | double_bun, looking_at_viewer, open_mouth, smile, 1girl, blue_bikini, solo, ;d, antenna_hair, ass, blush, hood, innertube, one-piece_tan, one_eye_closed, small_breasts, torpedo, water, barefoot, beachball, blue_sky, cloud, day, official_alternate_costume, outdoors, polka_dot_bikini, wrist_scrunchie |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | 1girl | solo | black_gloves | fingerless_gloves | shorts | smile | white_background | bare_shoulders | blush | cannon | full_body | hair_ornament | holding_weapon | machinery | navel | rigging | thighhighs | turret | bangs | closed_mouth | dark-skinned_female | simple_background | standing | sword | double_bun | open_mouth | blue_bikini | ;d | antenna_hair | ass | hood | innertube | one-piece_tan | one_eye_closed | small_breasts | torpedo | water | barefoot | beachball | blue_sky | cloud | day | official_alternate_costume | outdoors | polka_dot_bikini | wrist_scrunchie |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:-------|:---------------|:--------------------|:---------|:--------|:-------------------|:-----------------|:--------|:---------|:------------|:----------------|:-----------------|:------------|:--------|:----------|:-------------|:---------|:--------|:---------------|:----------------------|:--------------------|:-----------|:--------|:-------------|:-------------|:--------------|:-----|:---------------|:------|:-------|:------------|:----------------|:-----------------|:----------------|:----------|:--------|:-----------|:------------|:-----------|:--------|:------|:-----------------------------|:-----------|:-------------------|:------------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | | | X | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/le_mars_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T10:11:26+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T10:18:02+00:00 |
57b8f84260ff6580093d5d8b2c87e6ce23a09392 |
# Dataset of nagatsuki/長月/长月 (Azur Lane)
This is the dataset of nagatsuki/長月/长月 (Azur Lane), containing 21 images and their tags.
The core tags of this character are `animal_ears, long_hair, brown_hair, dog_ears, purple_eyes, hair_ornament, tail, dog_tail, crescent_hair_ornament, fang, hat, ribbon, school_hat, hairclip, side_ponytail, bangs, dog_girl, very_long_hair, yellow_headwear, bow, hair_between_eyes, hair_bow, candy_hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 21 | 24.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagatsuki_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 21 | 14.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagatsuki_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 50 | 30.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagatsuki_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 21 | 22.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagatsuki_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 50 | 42.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagatsuki_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nagatsuki_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
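The non-raw packages (`800`, `1200`, and the `stage3` crops) ship as plain image/text pairs, so they can also be consumed without waifuc. Below is a minimal sketch for pairing each image with its tags; it assumes the archive has been extracted to `dataset_dir` and that tags are stored as comma-separated text in a `.txt` file sharing the image's name (as in the IMG+TXT layout above), with Pillow available for reading images.

```python
from pathlib import Path

from PIL import Image  # Pillow

dataset_dir = Path('dataset_dir')  # directory extracted from e.g. dataset-800.zip

image_suffixes = {'.png', '.jpg', '.jpeg', '.webp'}
for image_path in sorted(p for p in dataset_dir.iterdir() if p.suffix.lower() in image_suffixes):
    tag_path = image_path.with_suffix('.txt')
    if not tag_path.exists():
        continue  # skip images without a matching tag file
    tags = [t.strip() for t in tag_path.read_text(encoding='utf-8').split(',') if t.strip()]
    with Image.open(image_path) as image:
        print(image_path.name, image.size, tags[:5])
```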
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | blush, open_mouth, 1girl, crescent, smile, solo, looking_at_viewer, blue_shirt, kindergarten_uniform, long_sleeves, pantyhose, blue_skirt, school_uniform |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | open_mouth | 1girl | crescent | smile | solo | looking_at_viewer | blue_shirt | kindergarten_uniform | long_sleeves | pantyhose | blue_skirt | school_uniform |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------|:-----------|:--------|:-------|:--------------------|:-------------|:-----------------------|:---------------|:------------|:-------------|:-----------------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/nagatsuki_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T10:11:31+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T10:23:39+00:00 |
c6f81191b41e1f772229e4a84cf21d4572901ed0 |
# Dataset of chaser/チェイサー/追赶者 (Azur Lane)
This is the dataset of chaser/チェイサー/追赶者 (Azur Lane), containing 11 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, breasts, large_breasts, long_hair, bangs, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 13.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chaser_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 8.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chaser_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 24 | 14.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chaser_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 12.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chaser_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 24 | 20.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chaser_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chaser_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, cleavage, dress, open_mouth, holding, simple_background, white_background, closed_mouth, full_body, long_sleeves, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | cleavage | dress | open_mouth | holding | simple_background | white_background | closed_mouth | full_body | long_sleeves | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:-----------|:--------|:-------------|:----------|:--------------------|:-------------------|:---------------|:------------|:---------------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/chaser_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T10:12:17+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T10:14:55+00:00 |
11e034e491f9c05c19ecf7c7a212cc3942234581 | Anna15/Vektor | [
"region:us"
] | 2024-01-14T10:19:27+00:00 | {} | 2024-01-14T10:19:29+00:00 |
|
4012ae6685f5a4f43866ef8eb4b2c36a2637ee87 | jtatman/pile_python_instruct_format | [
"region:us"
] | 2024-01-14T10:22:17+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "system", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4154192592, "num_examples": 3622953}], "download_size": 1877354904, "dataset_size": 4154192592}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T10:37:15+00:00 |
|
7b1f7e74116063ea5bfab83697e17b3d6bbdb0c2 |
# Dataset Card for Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v3](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v3",
"harness_winogrande_5",
split="train")
```
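The available configurations can also be enumerated programmatically before choosing one. The sketch below uses the `datasets` helper `get_dataset_config_names` and then loads the GSM8K details from the "latest" split; the config and split names are those declared in this dataset's metadata.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v3"

# enumerate the per-task configurations plus the aggregated "results" configuration
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# the "latest" split mirrors the most recent timestamped run
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details)
```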
## Latest results
These are the [latest results from run 2024-01-14T10:22:49.140128](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v3/blob/main/results_2024-01-14T10-22-49.140128.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6208194474020036,
"acc_stderr": 0.03248977876941496,
"acc_norm": 0.6279979276428014,
"acc_norm_stderr": 0.03316548494950365,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6128599699123188,
"mc2_stderr": 0.01559006777346825
},
"harness|arc:challenge|25": {
"acc": 0.6339590443686007,
"acc_stderr": 0.01407722310847014,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902274
},
"harness|hellaswag|10": {
"acc": 0.6611232822146983,
"acc_stderr": 0.0047236053769369145,
"acc_norm": 0.8453495319657439,
"acc_norm_stderr": 0.0036083220651418834
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647078,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647078
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848043,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848043
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069443,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001505,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001505
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187303,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187303
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630457,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032199,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032199
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.02740385941078685,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.02740385941078685
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6128599699123188,
"mc2_stderr": 0.01559006777346825
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774099
},
"harness|gsm8k|5": {
"acc": 0.25549658832448824,
"acc_stderr": 0.012013462405460069
}
}
```
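The per-task entries above can be post-processed directly. As an example, here is a small sketch that recomputes the mean accuracy over the MMLU (hendrycksTest) subtasks, assuming the JSON block shown above has been saved locally as `results.json`:

```python
import json

# results.json is assumed to contain the JSON block shown above
with open("results.json", encoding="utf-8") as f:
    results = json.load(f)

# collect the MMLU (hendrycksTest) subtask entries and average their accuracy
mmlu = {name: entry for name, entry in results.items() if name.startswith("harness|hendrycksTest-")}
mean_acc = sum(entry["acc"] for entry in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```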
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v3 | [
"region:us"
] | 2024-01-14T10:25:11+00:00 | {"pretty_name": "Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v3](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T10:22:49.140128](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v3/blob/main/results_2024-01-14T10-22-49.140128.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6208194474020036,\n \"acc_stderr\": 0.03248977876941496,\n \"acc_norm\": 0.6279979276428014,\n \"acc_norm_stderr\": 0.03316548494950365,\n \"mc1\": 0.4418604651162791,\n \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6128599699123188,\n \"mc2_stderr\": 0.01559006777346825\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6339590443686007,\n \"acc_stderr\": 0.01407722310847014,\n \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902274\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6611232822146983,\n \"acc_stderr\": 0.0047236053769369145,\n \"acc_norm\": 0.8453495319657439,\n \"acc_norm_stderr\": 0.0036083220651418834\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224469,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224469\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647078,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647078\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848043,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848043\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069443,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8135376756066411,\n \"acc_stderr\": 0.013927751372001505,\n \"acc_norm\": 0.8135376756066411,\n \"acc_norm_stderr\": 0.013927751372001505\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.02475241196091721,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.02475241196091721\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n \"acc_stderr\": 0.015748421208187303,\n \"acc_norm\": 0.3318435754189944,\n \"acc_norm_stderr\": 0.015748421208187303\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630457,\n \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630457\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n \"acc_stderr\": 0.012719949543032199,\n \"acc_norm\": 0.4556714471968709,\n \"acc_norm_stderr\": 0.012719949543032199\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.02740385941078685,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.02740385941078685\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6128599699123188,\n \"mc2_stderr\": 0.01559006777346825\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774099\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.25549658832448824,\n \"acc_stderr\": 0.012013462405460069\n 
}\n}\n```", "repo_url": "https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|arc:challenge|25_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|gsm8k|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hellaswag|10_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T10-22-49.140128.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T10-22-49.140128.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T10-22-49.140128.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T10-22-49.140128.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T10-22-49.140128.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T10_22_49.140128", "path": ["**/details_harness|winogrande|5_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T10-22-49.140128.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T10_22_49.140128", "path": ["results_2024-01-14T10-22-49.140128.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T10-22-49.140128.parquet"]}]}]} | 2024-01-14T10:25:31+00:00 |
1463f11dd6850215f1515ca927d4dd05612a18c8 |
# Dataset Card for Evaluation run of FelixChao/NarutoDolphin-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/NarutoDolphin-7B](https://huggingface.co/FelixChao/NarutoDolphin-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
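A minimal sketch of loading this aggregated configuration (the config name "results" comes from this repository's configuration metadata; the "latest" split is assumed to point to the most recent run):
```python
from datasets import load_dataset

# Load the aggregated metrics of the run (the "results" configuration).
# "latest" is the split assumed to point to the most recent evaluation;
# each run also has a split named after its timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_FelixChao__NarutoDolphin-7B",
    "results",
    split="latest",
)
print(results)
```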
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__NarutoDolphin-7B",
"harness_winogrande_5",
split="train")
```
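Each details configuration holds the per-example records produced by the harness for that task; once loaded, `data.column_names` shows the exact schema for this run and `data[0]` a single evaluated example.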
## Latest results
These are the [latest results from run 2024-01-14T10:24:57.162360](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__NarutoDolphin-7B/blob/main/results_2024-01-14T10-24-57.162360.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6306583942825644,
"acc_stderr": 0.03252627508388141,
"acc_norm": 0.632276909104878,
"acc_norm_stderr": 0.03317986227116511,
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5912860013096678,
"mc2_stderr": 0.015586868131613507
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.014169664520303098,
"acc_norm": 0.6382252559726962,
"acc_norm_stderr": 0.014041957945038083
},
"harness|hellaswag|10": {
"acc": 0.6542521410077674,
"acc_stderr": 0.0047463946133845325,
"acc_norm": 0.841665006970723,
"acc_norm_stderr": 0.0036430875292137216
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137282,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137282
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6528301886792452,
"acc_stderr": 0.029300101705549652,
"acc_norm": 0.6528301886792452,
"acc_norm_stderr": 0.029300101705549652
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895536,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895536
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396993,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396993
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612907,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612907
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640773,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640773
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.01374079725857982,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.01374079725857982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3642458100558659,
"acc_stderr": 0.0160943387684746,
"acc_norm": 0.3642458100558659,
"acc_norm_stderr": 0.0160943387684746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.02563082497562135,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.02563082497562135
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599923,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599923
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013014,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5912860013096678,
"mc2_stderr": 0.015586868131613507
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126735
},
"harness|gsm8k|5": {
"acc": 0.5943896891584534,
"acc_stderr": 0.013524848894462115
}
}
```
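For offline inspection, the raw JSON linked above can also be fetched directly. A minimal sketch using `huggingface_hub` (the filename is taken from the "Latest results" link above; the per-task dictionary may sit at the top level or under a "results" key depending on the file layout):
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_FelixChao__NarutoDolphin-7B",
    repo_type="dataset",
    filename="results_2024-01-14T10-24-57.162360.json",
)
with open(path) as f:
    data = json.load(f)

# The excerpt above shows the per-task dictionary; handle both possible layouts.
per_task = data.get("results", data)
print(per_task["all"])
```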
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__NarutoDolphin-7B | [
"region:us"
] | 2024-01-14T10:27:14+00:00 | {"pretty_name": "Evaluation run of FelixChao/NarutoDolphin-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/NarutoDolphin-7B](https://huggingface.co/FelixChao/NarutoDolphin-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__NarutoDolphin-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T10:24:57.162360](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__NarutoDolphin-7B/blob/main/results_2024-01-14T10-24-57.162360.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6306583942825644,\n \"acc_stderr\": 0.03252627508388141,\n \"acc_norm\": 0.632276909104878,\n \"acc_norm_stderr\": 0.03317986227116511,\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5912860013096678,\n \"mc2_stderr\": 0.015586868131613507\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.014169664520303098,\n \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038083\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6542521410077674,\n \"acc_stderr\": 0.0047463946133845325,\n \"acc_norm\": 0.841665006970723,\n \"acc_norm_stderr\": 0.0036430875292137216\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137282,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137282\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6528301886792452,\n \"acc_stderr\": 0.029300101705549652,\n \"acc_norm\": 0.6528301886792452,\n \"acc_norm_stderr\": 0.029300101705549652\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396993,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396993\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612907,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612907\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640773,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640773\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8199233716475096,\n \"acc_stderr\": 0.01374079725857982,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.01374079725857982\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3642458100558659,\n \"acc_stderr\": 0.0160943387684746,\n \"acc_norm\": 0.3642458100558659,\n \"acc_norm_stderr\": 0.0160943387684746\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.02563082497562135,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.02563082497562135\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n \"acc_stderr\": 0.012687818419599923,\n \"acc_norm\": 0.44328552803129073,\n \"acc_norm_stderr\": 0.012687818419599923\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.027686913588013014,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.027686913588013014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5912860013096678,\n \"mc2_stderr\": 0.015586868131613507\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5943896891584534,\n \"acc_stderr\": 0.013524848894462115\n 
}\n}\n```", "repo_url": "https://huggingface.co/FelixChao/NarutoDolphin-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|arc:challenge|25_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|gsm8k|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hellaswag|10_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T10-24-57.162360.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T10-24-57.162360.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T10-24-57.162360.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T10-24-57.162360.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T10-24-57.162360.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T10_24_57.162360", "path": ["**/details_harness|winogrande|5_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T10-24-57.162360.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T10_24_57.162360", "path": ["results_2024-01-14T10-24-57.162360.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T10-24-57.162360.parquet"]}]}]} | 2024-01-14T10:27:33+00:00 |
42faef7ff2434d5268278162e5fd9da75b9537dc | mncai/distilabel-math-preference-dpo-ko | [
"region:us"
] | 2024-01-14T10:27:20+00:00 | {} | 2024-01-14T10:27:21+00:00 |
|
72292bed93c18a41807f5af8307567b079bea25c | mncai/finance-tasks-ConvFinQA-ko | [
"region:us"
] | 2024-01-14T10:28:24+00:00 | {} | 2024-01-14T10:28:24+00:00 |
|
28053aefbe5ee31647d467c78629e3606c240477 | mncai/ultrafeedback_binarized_cleaned-ko | [
"region:us"
] | 2024-01-14T10:31:16+00:00 | {} | 2024-01-14T10:31:27+00:00 |
|
99cdf17ce9b89e217dc48bce370955ef5b1badad |
# Dataset of emanuele_pessagno/エマヌエーレ・ペッサーニョ/埃曼努埃尔·佩萨格诺 (Azur Lane)
This is the dataset of emanuele_pessagno/エマヌエーレ・ペッサーニョ/埃曼努埃尔·佩萨格诺 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `long_hair, pink_hair, breasts, pink_eyes, bangs, hairband, hair_between_eyes, large_breasts, purple_eyes, ahoge, bow, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 21.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emanuele_pessagno_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 10.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emanuele_pessagno_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 28 | 21.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emanuele_pessagno_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 18.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emanuele_pessagno_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 28 | 32.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emanuele_pessagno_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
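The IMG+TXT packages listed above pair each image with a plain-text tag file. As a minimal sketch of how to consume them (assuming the usual layout where every image has a same-named `.txt` file beside it and that the tags are comma-separated; both are assumptions about the archive layout, not guarantees), you can walk an extracted archive and collect the tags per image:

```python
from pathlib import Path

# Directory where one of the IMG+TXT archives (e.g. dataset-800.zip) was extracted.
# The directory name here is only an example for this sketch.
extracted_dir = Path('dataset_800_dir')

image_files = sorted(extracted_dir.rglob('*.png')) + sorted(extracted_dir.rglob('*.jpg'))
for image_path in image_files:
    tag_file = image_path.with_suffix('.txt')
    if not tag_file.exists():
        # Skip images without a paired tag file.
        continue
    tags = [t.strip() for t in tag_file.read_text(encoding='utf-8').split(',') if t.strip()]
    print(image_path.name, tags)
```

If the tag files use a different delimiter than commas, adjust the parsing accordingly.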
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/emanuele_pessagno_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, cleavage, frills, long_sleeves, white_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | cleavage | frills | long_sleeves | white_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:-----------|:---------|:---------------|:--------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X |
| CyberHarem/emanuele_pessagno_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T10:32:05+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T10:35:28+00:00 |
bd1926e07818a06bc46e711a2dcfc39bd7d5932f |
# Dataset of u_410/U-410 (Azur Lane)
This is the dataset of u_410/U-410 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `breasts, grey_hair, red_eyes, long_hair, medium_breasts, mole, mole_under_eye, bangs, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 20.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_410_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 12.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_410_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 31 | 23.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_410_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 18.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_410_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 31 | 33.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_410_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
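If you want to inspect one of the packages above before extracting it, a short sketch (the filename can be any row of the table; `dataset-800.zip` is just an example) is to download the archive with `hf_hub_download` and list its contents:

```python
import zipfile
from huggingface_hub import hf_hub_download

# Download one of the packages listed above without extracting it.
zip_file = hf_hub_download(
    repo_id='CyberHarem/u_410_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Peek at the first few entries and their sizes.
with zipfile.ZipFile(zip_file, 'r') as zf:
    for info in zf.infolist()[:20]:
        print(info.filename, info.file_size)
```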
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/u_410_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | looking_at_viewer, 1girl, bare_shoulders, solo, black_one-piece_swimsuit, iron_cross, red_gloves, underboob, choker, leg_tattoo, smile, thighs, cross_necklace, holding, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | 1girl | bare_shoulders | solo | black_one-piece_swimsuit | iron_cross | red_gloves | underboob | choker | leg_tattoo | smile | thighs | cross_necklace | holding | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:-----------------|:-------|:---------------------------|:-------------|:-------------|:------------|:---------|:-------------|:--------|:---------|:-----------------|:----------|:--------------------|:-------------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/u_410_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T10:32:09+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T10:36:36+00:00 |
527effe780b58c5674234ccb068e4a55917e7a20 | Orenbac/news_raw | [
"region:us"
] | 2024-01-14T10:32:41+00:00 | {} | 2024-01-14T10:56:30+00:00 |
|
c641405bf626756110fa82fe99eb0730b793e624 | satpalsr/translation | [
"region:us"
] | 2024-01-14T10:34:39+00:00 | {} | 2024-01-14T10:34:48+00:00 |
|
25c2673f71080904fb3d71e0c58a562db21cef29 |
# Dataset Card for Evaluation run of macadeliccc/Orca-SOLAR-4x10.7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/Orca-SOLAR-4x10.7b](https://huggingface.co/macadeliccc/Orca-SOLAR-4x10.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__Orca-SOLAR-4x10.7b",
"harness_winogrande_5",
split="train")
```
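The aggregated "results" configuration mentioned above can be loaded the same way. A minimal sketch, assuming the configs and splits declared in this card's metadata (each configuration exposes a timestamped split plus a "latest" split) resolve directly through `datasets`:

```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points at the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__Orca-SOLAR-4x10.7b",
    "results",
    split="latest",
)
print(results)
```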
## Latest results
These are the [latest results from run 2024-01-14T10:39:27.836739](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__Orca-SOLAR-4x10.7b/blob/main/results_2024-01-14T10-39-27.836739.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.673209053846357,
"acc_stderr": 0.03136589520436306,
"acc_norm": 0.6739418750343522,
"acc_norm_stderr": 0.032009470403028546,
"mc1": 0.48225214198286415,
"mc1_stderr": 0.017492470843075356,
"mc2": 0.6453861832315595,
"mc2_stderr": 0.015356815946066372
},
"harness|arc:challenge|25": {
"acc": 0.6527303754266212,
"acc_stderr": 0.013913034529620455,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.013572657703084948
},
"harness|hellaswag|10": {
"acc": 0.6820354511053575,
"acc_stderr": 0.004647338877642187,
"acc_norm": 0.867755427205736,
"acc_norm_stderr": 0.0033806414709899313
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.03141082197596239,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.03141082197596239
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.025722097064388535,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.025722097064388535
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.02203721734026784,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.02203721734026784
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284336,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328972,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328972
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970562,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970562
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342856,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342856
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8676470588235294,
"acc_stderr": 0.023784297520918856,
"acc_norm": 0.8676470588235294,
"acc_norm_stderr": 0.023784297520918856
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7399103139013453,
"acc_stderr": 0.029442495585857476,
"acc_norm": 0.7399103139013453,
"acc_norm_stderr": 0.029442495585857476
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.0134682016140663,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.0134682016140663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.02370309952525817,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.02370309952525817
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.016513676031179595,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.016513676031179595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7973856209150327,
"acc_stderr": 0.02301544687798568,
"acc_norm": 0.7973856209150327,
"acc_norm_stderr": 0.02301544687798568
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218894,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218894
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5058670143415906,
"acc_stderr": 0.012769356925216526,
"acc_norm": 0.5058670143415906,
"acc_norm_stderr": 0.012769356925216526
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7536764705882353,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.7536764705882353,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.018521756215423027,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.018521756215423027
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48225214198286415,
"mc1_stderr": 0.017492470843075356,
"mc2": 0.6453861832315595,
"mc2_stderr": 0.015356815946066372
},
"harness|winogrande|5": {
"acc": 0.8389897395422258,
"acc_stderr": 0.010329712832785718
},
"harness|gsm8k|5": {
"acc": 0.6823351023502654,
"acc_stderr": 0.01282406662148884
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_macadeliccc__Orca-SOLAR-4x10.7b | [
"region:us"
] | 2024-01-14T10:41:42+00:00 | {"pretty_name": "Evaluation run of macadeliccc/Orca-SOLAR-4x10.7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/Orca-SOLAR-4x10.7b](https://huggingface.co/macadeliccc/Orca-SOLAR-4x10.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__Orca-SOLAR-4x10.7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T10:39:27.836739](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__Orca-SOLAR-4x10.7b/blob/main/results_2024-01-14T10-39-27.836739.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.673209053846357,\n \"acc_stderr\": 0.03136589520436306,\n \"acc_norm\": 0.6739418750343522,\n \"acc_norm_stderr\": 0.032009470403028546,\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.017492470843075356,\n \"mc2\": 0.6453861832315595,\n \"mc2_stderr\": 0.015356815946066372\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.013913034529620455,\n \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.013572657703084948\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6820354511053575,\n \"acc_stderr\": 0.004647338877642187,\n \"acc_norm\": 0.867755427205736,\n \"acc_norm_stderr\": 0.0033806414709899313\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n 
\"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.03141082197596239,\n \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.03141082197596239\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.025722097064388535,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.025722097064388535\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n \"acc_stderr\": 0.02203721734026784,\n \"acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.02203721734026784\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284336,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n 
\"acc_stderr\": 0.023710888501970562,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970562\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342856,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342856\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8676470588235294,\n \"acc_stderr\": 0.023784297520918856,\n \"acc_norm\": 0.8676470588235294,\n \"acc_norm_stderr\": 0.023784297520918856\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7399103139013453,\n \"acc_stderr\": 0.029442495585857476,\n \"acc_norm\": 0.7399103139013453,\n \"acc_norm_stderr\": 0.029442495585857476\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.0134682016140663,\n \"acc_norm\": 0.8288633461047255,\n 
\"acc_norm_stderr\": 0.0134682016140663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525817,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525817\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n \"acc_stderr\": 0.016513676031179595,\n \"acc_norm\": 0.4212290502793296,\n \"acc_norm_stderr\": 0.016513676031179595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7973856209150327,\n \"acc_stderr\": 0.02301544687798568,\n \"acc_norm\": 0.7973856209150327,\n \"acc_norm_stderr\": 0.02301544687798568\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218894,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218894\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5058670143415906,\n \"acc_stderr\": 0.012769356925216526,\n \"acc_norm\": 0.5058670143415906,\n \"acc_norm_stderr\": 0.012769356925216526\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7009803921568627,\n \"acc_stderr\": 0.018521756215423027,\n \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.018521756215423027\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866764,\n \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866764\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.017492470843075356,\n \"mc2\": 0.6453861832315595,\n \"mc2_stderr\": 0.015356815946066372\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.010329712832785718\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6823351023502654,\n \"acc_stderr\": 0.01282406662148884\n }\n}\n```", "repo_url": "https://huggingface.co/macadeliccc/Orca-SOLAR-4x10.7b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|arc:challenge|25_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|gsm8k|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hellaswag|10_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T10-39-27.836739.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T10-39-27.836739.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T10-39-27.836739.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T10-39-27.836739.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T10-39-27.836739.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T10-39-27.836739.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["**/details_harness|winogrande|5_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T10-39-27.836739.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T10_39_27.836739", "path": ["results_2024-01-14T10-39-27.836739.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T10-39-27.836739.parquet"]}]}]} | 2024-01-14T10:42:01+00:00 |
b71b9371daf7f3aeab07557c3597c29be0246c86 | 316usman/medical | [
"license:bsd",
"region:us"
] | 2024-01-14T10:51:40+00:00 | {"license": "bsd", "dataset_info": {"features": [{"name": "vector", "sequence": "float32"}, {"name": "metadata", "struct": [{"name": "text", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 955472430, "num_examples": 158114}], "download_size": 898315744, "dataset_size": 955472430}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T15:40:14+00:00 |
|
e5ec115aa46829c8db7fe99345a798aa56184cf6 | vatavusara/drugs | [
"license:mit",
"region:us"
] | 2024-01-14T10:55:42+00:00 | {"license": "mit"} | 2024-01-14T12:10:03+00:00 |
|
843f9f9bf57098b36b19892a42f2a4bf41f9a963 | DMLuck/phi-model-data | [
"region:us"
] | 2024-01-14T11:00:57+00:00 | {} | 2024-01-14T11:00:58+00:00 |
|
50c09914dd1c8946145b3155fd66fa6c34497b02 | nbroad/fewnerd-organizations | [
"region:us"
] | 2024-01-14T11:01:17+00:00 | {"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "ner_tags", "sequence": {"class_label": {"names": {"0": "O", "1": "B-ORG", "2": "I-ORG"}}}}], "splits": [{"name": "train", "num_bytes": 50533000, "num_examples": 122459}, {"name": "test", "num_bytes": 15189310, "num_examples": 36738}, {"name": "validation", "num_bytes": 6485434, "num_examples": 15745}], "download_size": 18458832, "dataset_size": 72207744}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T11:01:32+00:00 |
|
fed2732cc0142bc7cec4e2f0a0a413b913b2ead5 |
# Dataset of bush/ブッシュ/布什 (Azur Lane)
This is the dataset of bush/ブッシュ/布什 (Azur Lane), containing 12 images and their tags.
The core tags of this character are `hat, purple_eyes, purple_hair, short_hair, hair_ornament, hairclip, bangs, beret`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 7.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bush_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 6.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bush_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 24 | 11.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bush_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 7.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bush_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 24 | 13.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bush_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
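Any of the packed variants above can be fetched programmatically as well; a minimal sketch, assuming the zip filenames shown in the download links are stored at the repository root (swap `dataset-800.zip` for whichever package you need):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download one of the packed IMG+TXT packages listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/bush_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',  # e.g. the 800px package
)

# extract the images and their paired .txt tag files
package_dir = 'bush_800'
os.makedirs(package_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(package_dir)
```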
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/bush_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, jacket, looking_at_viewer, solo, blush, smile, white_background, long_sleeves, open_mouth, shoes, simple_background, collarbone, holding, hood, white_footwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jacket | looking_at_viewer | solo | blush | smile | white_background | long_sleeves | open_mouth | shoes | simple_background | collarbone | holding | hood | white_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:--------------------|:-------|:--------|:--------|:-------------------|:---------------|:-------------|:--------|:--------------------|:-------------|:----------|:-------|:-----------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/bush_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T11:05:27+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T11:07:42+00:00 |
d69fd1b9c4ddce3570db8783c16cab2c9eb97ad9 |
# Dataset of dace/デイス/鲦鱼 (Azur Lane)
This is the dataset of dace/デイス/鲦鱼 (Azur Lane), containing 14 images and their tags.
The core tags of this character are `blue_eyes, breasts, pink_hair, medium_breasts, ponytail, bow, fang, hair_bow, bangs, long_hair, black_bow, hair_ornament, parted_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 14 | 14.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dace_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 14 | 8.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dace_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 31 | 18.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dace_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 14 | 12.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dace_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 31 | 27.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dace_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dace_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, blush, open_mouth, solo, looking_at_viewer, navel, one-piece_swimsuit, bare_shoulders, elbow_gloves, :d, leotard, simple_background, toeless_legwear, black_gloves, highleg, star_(symbol), straddling, torpedo, watercraft, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | open_mouth | solo | looking_at_viewer | navel | one-piece_swimsuit | bare_shoulders | elbow_gloves | :d | leotard | simple_background | toeless_legwear | black_gloves | highleg | star_(symbol) | straddling | torpedo | watercraft | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------------|:-------|:--------------------|:--------|:---------------------|:-----------------|:---------------|:-----|:----------|:--------------------|:------------------|:---------------|:----------|:----------------|:-------------|:----------|:-------------|:-------------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/dace_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T11:05:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T11:09:27+00:00 |
2b9b3441cc5df6764d7ebf9de843065d7765a276 |
# Dataset of hornet_ii/ホーネットII/大黄蜂II (Azur Lane)
This is the dataset of hornet_ii/ホーネットII/大黄蜂II (Azur Lane), containing 23 images and their tags.
The core tags of this character are `blonde_hair, long_hair, bangs, breasts, green_eyes, large_breasts, twintails, very_long_hair, hat, black_headwear, cowboy_hat, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 30.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hornet_ii_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 19.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hornet_ii_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 58 | 40.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hornet_ii_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 27.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hornet_ii_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 58 | 51.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hornet_ii_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hornet_ii_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, bikini_top_only, black_bikini, cleavage, short_shorts, black_gloves, fingerless_gloves, looking_at_viewer, midriff, solo, black_shorts, blush, shrug_(clothing), black_thighhighs, grin, navel, standing, boots, belt, cowboy_shot, one_eye_closed, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bikini_top_only | black_bikini | cleavage | short_shorts | black_gloves | fingerless_gloves | looking_at_viewer | midriff | solo | black_shorts | blush | shrug_(clothing) | black_thighhighs | grin | navel | standing | boots | belt | cowboy_shot | one_eye_closed | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------------|:---------------|:-----------|:---------------|:---------------|:--------------------|:--------------------|:----------|:-------|:---------------|:--------|:-------------------|:-------------------|:-------|:--------|:-----------|:--------|:-------|:--------------|:-----------------|:--------------------|:-------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/hornet_ii_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T11:06:43+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T11:10:51+00:00 |
542b566e1bb8df76d594b57f631a7e19de2a298b |
# Dataset Card for Evaluation run of Abhinav7/NeuralPipe-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Abhinav7/NeuralPipe-7B-slerp](https://huggingface.co/Abhinav7/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Abhinav7__NeuralPipe-7B-slerp",
"harness_winogrande_5",
split="train")
```
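The aggregated results can be pulled the same way; a minimal sketch, assuming this repository follows the `results` configuration and `latest` split naming described above:

```python
from datasets import load_dataset

# aggregated metrics for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_Abhinav7__NeuralPipe-7B-slerp",
    "results",
    split="latest",
)
print(results[0])
```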
## Latest results
These are the [latest results from run 2024-01-14T11:22:42.602148](https://huggingface.co/datasets/open-llm-leaderboard/details_Abhinav7__NeuralPipe-7B-slerp/blob/main/results_2024-01-14T11-22-42.602148.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6447169263171724,
"acc_stderr": 0.03211493893533018,
"acc_norm": 0.6450175117328331,
"acc_norm_stderr": 0.03277128130072703,
"mc1": 0.43084455324357407,
"mc1_stderr": 0.017335272475332366,
"mc2": 0.5982418830210784,
"mc2_stderr": 0.01515275893598861
},
"harness|arc:challenge|25": {
"acc": 0.6467576791808873,
"acc_stderr": 0.013967822714840055,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693252
},
"harness|hellaswag|10": {
"acc": 0.6692889862577176,
"acc_stderr": 0.004695076629884537,
"acc_norm": 0.8611830312686716,
"acc_norm_stderr": 0.003450488042965012
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.02386800326250011,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.02386800326250011
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.358659217877095,
"acc_stderr": 0.016040454426164474,
"acc_norm": 0.358659217877095,
"acc_norm_stderr": 0.016040454426164474
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922438,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922438
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43084455324357407,
"mc1_stderr": 0.017335272475332366,
"mc2": 0.5982418830210784,
"mc2_stderr": 0.01515275893598861
},
"harness|winogrande|5": {
"acc": 0.797947908445146,
"acc_stderr": 0.011285013754047436
},
"harness|gsm8k|5": {
"acc": 0.6929492039423806,
"acc_stderr": 0.01270568572313171
}
}
```
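For quick programmatic access to these numbers, the per-task details and the aggregated results can be pulled with the `datasets` library. This is a minimal sketch; the config names (e.g. `harness_gsm8k_5`, `results`) and the `latest` split are taken from the configuration list declared in this card's metadata.
```python
from datasets import load_dataset

# per-sample details for the 5-shot GSM8K run ("latest" always points to the most recent evaluation)
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_Abhinav7__NeuralPipe-7B-slerp",
    "harness_gsm8k_5",
    split="latest",
)

# aggregated results, i.e. the numbers shown in the JSON block above
results = load_dataset(
    "open-llm-leaderboard/details_Abhinav7__NeuralPipe-7B-slerp",
    "results",
    split="latest",
)
print(results[0])
```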
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Abhinav7__NeuralPipe-7B-slerp | [
"region:us"
] | 2024-01-14T11:25:01+00:00 | {"pretty_name": "Evaluation run of Abhinav7/NeuralPipe-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Abhinav7/NeuralPipe-7B-slerp](https://huggingface.co/Abhinav7/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Abhinav7__NeuralPipe-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T11:22:42.602148](https://huggingface.co/datasets/open-llm-leaderboard/details_Abhinav7__NeuralPipe-7B-slerp/blob/main/results_2024-01-14T11-22-42.602148.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6447169263171724,\n \"acc_stderr\": 0.03211493893533018,\n \"acc_norm\": 0.6450175117328331,\n \"acc_norm_stderr\": 0.03277128130072703,\n \"mc1\": 0.43084455324357407,\n \"mc1_stderr\": 0.017335272475332366,\n \"mc2\": 0.5982418830210784,\n \"mc2_stderr\": 0.01515275893598861\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6467576791808873,\n \"acc_stderr\": 0.013967822714840055,\n \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693252\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6692889862577176,\n \"acc_stderr\": 0.004695076629884537,\n \"acc_norm\": 0.8611830312686716,\n \"acc_norm_stderr\": 0.003450488042965012\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n 
\"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6538461538461539,\n \"acc_stderr\": 
0.02412112541694119,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n 
\"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.02386800326250011,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.02386800326250011\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.358659217877095,\n \"acc_stderr\": 0.016040454426164474,\n \"acc_norm\": 0.358659217877095,\n \"acc_norm_stderr\": 0.016040454426164474\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n \"acc_stderr\": 0.012750151802922438,\n \"acc_norm\": 0.47196870925684486,\n \"acc_norm_stderr\": 0.012750151802922438\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43084455324357407,\n \"mc1_stderr\": 0.017335272475332366,\n \"mc2\": 0.5982418830210784,\n \"mc2_stderr\": 0.01515275893598861\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.011285013754047436\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \"acc_stderr\": 0.01270568572313171\n }\n}\n```", "repo_url": "https://huggingface.co/Abhinav7/NeuralPipe-7B-slerp", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|arc:challenge|25_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|gsm8k|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hellaswag|10_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-22-42.602148.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-22-42.602148.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-22-42.602148.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T11-22-42.602148.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-22-42.602148.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-22-42.602148.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["**/details_harness|winogrande|5_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T11-22-42.602148.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T11_22_42.602148", "path": ["results_2024-01-14T11-22-42.602148.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T11-22-42.602148.parquet"]}]}]} | 2024-01-14T11:25:22+00:00 |
d9fa0f22444cdacd366b8596557f19ecf043542a |
# Dataset of painleve/パンルヴェ/伴尔维 (Azur Lane)
This is the dataset of painleve/パンルヴェ/伴尔维 (Azur Lane), containing 29 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, long_hair, breasts, large_breasts, bangs, very_long_hair, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 29 | 47.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/painleve_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 29 | 23.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/painleve_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 69 | 48.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/painleve_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 29 | 39.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/painleve_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 69 | 73.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/painleve_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/painleve_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
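If only one of the pre-processed packages from the table above is needed (for example the 800px IMG+TXT variant), it can be fetched and unpacked the same way. A minimal sketch, with the `repo_id` and `filename` taken from the package table:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/painleve_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract the images and their .txt tag files to a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```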
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, solo, tiara, white_gloves, dress, blush, closed_mouth, elbow_gloves |
| 1 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, ponytail, bare_shoulders, hair_ornament, solo, black_choker, blush, ass, parted_lips, sports_bra, yoga_pants, black_pants, sweat, from_behind, indoors, looking_back, window |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | tiara | white_gloves | dress | blush | closed_mouth | elbow_gloves | ponytail | bare_shoulders | hair_ornament | black_choker | ass | parted_lips | sports_bra | yoga_pants | black_pants | sweat | from_behind | indoors | looking_back | window |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:---------------|:--------|:--------|:---------------|:---------------|:-----------|:-----------------|:----------------|:---------------|:------|:--------------|:-------------|:-------------|:--------------|:--------|:--------------|:----------|:---------------|:---------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/painleve_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T11:25:39+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T11:32:13+00:00 |
ee802a7a3b51d96363bc424b44f8679b5c9e27fa |
# Dataset of bellona/ベローナ/司战女神 (Azur Lane)
This is the dataset of bellona/ベローナ/司战女神 (Azur Lane), containing 52 images and their tags.
The core tags of this character are `breasts, short_hair, hair_between_eyes, large_breasts, bangs, purple_eyes, grey_hair, maid_headdress, purple_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 52 | 72.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bellona_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 52 | 36.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bellona_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 129 | 79.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bellona_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 52 | 61.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bellona_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 129 | 118.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bellona_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/bellona_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 26 |  |  |  |  |  | 1girl, solo, looking_at_viewer, underboob_cutout, black_dress, smile, official_alternate_costume, white_gloves, closed_mouth, simple_background, white_background, standing, white_apron, juliet_sleeves, maid_apron |
| 1 | 7 |  |  |  |  |  | crop_top, midriff, navel, 1girl, long_sleeves, smile, white_background, cowboy_shot, looking_at_viewer, pants, simple_background, solo, stomach, collarbone, sweater, denim, holding, standing, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | underboob_cutout | black_dress | smile | official_alternate_costume | white_gloves | closed_mouth | simple_background | white_background | standing | white_apron | juliet_sleeves | maid_apron | crop_top | midriff | navel | long_sleeves | cowboy_shot | pants | stomach | collarbone | sweater | denim | holding | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------------------|:--------------|:--------|:-----------------------------|:---------------|:---------------|:--------------------|:-------------------|:-----------|:--------------|:-----------------|:-------------|:-----------|:----------|:--------|:---------------|:--------------|:--------|:----------|:-------------|:----------|:--------|:----------|:--------------|
| 0 | 26 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | | X | | | | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/bellona_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T11:25:41+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T11:42:07+00:00 |
042e2ddf19d64a844ad596804bfb05d2c99401b8 | Orenbac/news_summarized | [
"region:us"
] | 2024-01-14T11:28:40+00:00 | {} | 2024-01-14T12:46:39+00:00 |
|
f5347c1665a0c96c54d793b7038372c3bee42cba | Jassarubat/female-khaliji | [
"region:us"
] | 2024-01-14T11:34:51+00:00 | {} | 2024-01-14T11:43:04+00:00 |
|
0055800b84d1147a52d40acbcaa6fc69e5f66395 |
<h1 align="center">🌸 Synthetic Haiku Prompts 🌸</h1>
<p align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/60107b385ac3e86b3ea4fc34/nmz7lvu64BytxDvPMm1C5.png" alt="Banner for a dataset card featuring a fusion of digital and traditional Japanese elements. The design includes stylized digital prompts and haikus within text bubbles and on digital screens, set against a backdrop of delicate cherry blossoms and a serene Japanese landscape. The color scheme is dominated by soft pastel pink tones, creating a harmonious blend of modern technology and classical poetry aesthetics." width="500">
</p>
<p align="center"><em>In data's embrace,<br>Synthetic haiku wishes bloom,<br>
Code-born poetry.<br>
</em></p>
# Dataset Card for Synthetic Haiku Prompts
[<img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-light.png" alt="Built with Distilabel" width="200" height="32"/>](https://github.com/argilla-io/distilabel)
## Dataset Details
This is a dataset of synthetic prompts that aims to replicate user requests to a chat model for a haiku about a given topic. The data was generated with the [distilabel](https://github.com/argilla-io/distilabel) library, using [teknium](https://huggingface.co/teknium)'s [OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) model. The prompts were generated from a seed list of terms and an adapted version of the [SELF-INSTRUCT](https://arxiv.org/abs/2212.10560) paper's prompting strategy.
This dataset was primarily constructed as part of a broader project to explore the extent to which open models and Direct Preference Optimization (DPO) can be used to generate synthetic data that can be used to effectively cultivate desired behavior in language models (in this case the ability to write haikus). The project is a WIP and is primarily a learning exercise for the author, but the dataset is being released in the hopes that it may be useful to others. You can also find the code used to generate the dataset [here](https://github.com/davanstrien/haiku-dpo). The main dataset for this project is at [davanstrien/haiku_dpo](https://huggingface.co/datasets/davanstrien/haiku_dpo).
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** Daniel van Strien
- **Language(s) (NLP):** English (synthetically generated)
- **License:** CC-BY-4.0
### Dataset Sources
- **Repository:** https://github.com/davanstrien/haiku-dpo
## Uses
### Direct Use
This dataset can be used to generate haikus about a given topic. The prompts are used as part of a wider project that uses these prompts as seeds to generate haikus.
### Out-of-Scope Use
This dataset is primarily intended for my own and others' learning. You could use it for other purposes but before doing this, I would suggest you validate the prompts to ensure that they are suitable for your use case.
## Dataset Structure
This dataset has one split and a single configuration. A single row of the dataset looks like this:
```python
{'instructions': 'Can you compose a haiku about the serenity of mountain peaks?'}
```
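A minimal loading sketch (the `train` split and the single `instructions` column follow the dataset configuration shown in this card's metadata):
```python
from datasets import load_dataset

# Load the prompts; there is a single "train" split with one "instructions" column.
prompts = load_dataset("davanstrien/haiku_prompts", split="train")

# Print one of the synthetic user requests.
print(prompts[0]["instructions"])
```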
## Dataset Creation
This dataset was constructed using the [distilabel](https://github.com/argilla-io/distilabel) library. It used a slightly modified version of the approach outlined in the SELF-INSTRUCT paper. The application description used was:
```python
application_description = (
"An AI assistant adept at writing Haikus. "
"It expects complete suggestions from users providing details of the kind of haiku they want. "
"The AI assistant will help users write haikus about particular topics and is willing to accept requests related to a specific subject or object or a more abstract request"
"based on an emotion, theme or vibe."
)
```
The main difference between this approach and the SELF-INSTRUCT approach is that I reformulated the task description to be more specific to the haiku generation task i.e. not asking for prompts to include step-by-step instructions. The following task description was used:
```python
"# Task Description
Develop {{ num_instructions }} user queries that can be received by the given AI application and applicable to the provided context. Emphasize diversity in verbs and linguistic structures within the model's textual capabilities.
# Criteria for Queries
Incorporate a diverse range of verbs, avoiding repetition.
Ensure queries are compatible with AI model's text generation functions and are limited to 1-2 sentences.
Design queries to be self-contained and standalone.
# AI Application
{{ application_description }}
# Context
{{ input }}
```
### Curation Rationale
This dataset was created as part of a larger effort to create a DPO dataset aimed at making LLMs better at writing haikus. This dataset is shared separately since it could be used independently of the other dataset.
#### Data Collection and Processing
No human annotators were used in the creation of this dataset. The original seed prompts were created by Daniel van Strien with help from ChatGPT-4 (used via the web interface). The actual prompts were created by [teknium](https://huggingface.co/teknium)'s [OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) model.
#### Personal and Sensitive Information
It is very unlikely that this dataset contains any personal or sensitive information, but if you find any prompts that you believe to be harmful, please open a discussion and I will remove them from the dataset.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Whilst I have not found any harmful prompts in the dataset, I have not manually validated all of the prompts. If you find any prompts which you believe to be harmful, please open a discussion and I will remove them from the dataset.
### Recommendations
The original seed prompts used to generate this dataset are by no means comprehensive, and the dataset is likely to be biased toward the topics covered by the seed prompts. If you would like to see more prompts about a particular topic, please open a discussion and I will add them to the seed list. In general, I focused on prompts that were more geared towards "traditional" haiku topics i.e. the natural world and the impermanence of life. If you want to use these prompts to generate a dataset of haikus about other topics, you may want to consider adding prompts that are more relevant to those topics.
## Citation
I have zero expectation that this dataset will be cited, but if you do use it in your work, please cite it as follows:
**BibTeX:**
```bibtex
@misc{vanstrien2024synthetichaikuprompts,
author = {van Strien, Daniel},
title = {Synthetic Haiku Prompts},
year = {2024},
publisher = {Hugging Face},
howpublished = {\url{https://huggingface.co/datasets/davanstrien/haiku_prompts}}
}
```
## Glossary
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
- DPO/Direct Preference Optimization: Introduced in [*Direct Preference Optimization: Your Language Model is Secretly a Reward Model*](https://huggingface.co/papers/2305.18290)
- SELF-INSTRUCT: A prompting strategy introduced in [*Self-Instruct: Aligning Language Model with Self Generated Instructions*](https://huggingface.co/papers/2212.10560)
## Dataset Card Authors
[davanstrien](https://huggingface.co/davanstrien)
## Dataset Card Contact
[davanstrien](https://huggingface.co/davanstrien) | davanstrien/haiku_prompts | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:en",
"license:cc-by-4.0",
"poetry",
"haiku",
"synthetic",
"distilabel",
"arxiv:2212.10560",
"arxiv:2305.18290",
"region:us"
] | 2024-01-14T11:35:13+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"], "pretty_name": "Synthetic Haiku Prompts", "dataset_info": {"features": [{"name": "instructions", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 280969, "num_examples": 4303}], "download_size": 95440, "dataset_size": 280969}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["poetry", "haiku", "synthetic", "distilabel"]} | 2024-01-15T16:26:38+00:00 |
37bf15740012cbbab45946da6454abccc7aaf5cf | flowersfromthefuture/Logic-1 | [
"region:us"
] | 2024-01-14T11:38:48+00:00 | {} | 2024-01-14T11:39:04+00:00 |
|
6c75be6edfbdfd1a565f2a682e34f28e4f022917 | # Dataset Card for truthful_qa_TrueFalse
This is a reduced variation of the truthful_qa dataset (https://huggingface.co/datasets/truthful_qa), modified to associate a boolean value with each given answer, with the correct answer provided as a reference.
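A minimal loading sketch with the `datasets` library; the split name and the exact field names are assumptions, since the schema is not documented in this card, so inspect `column_names` to see the actual fields:
```python
from datasets import load_dataset

# Sketch only: the "train" split name and the exact field names are assumptions,
# as the card does not document the schema.
ds = load_dataset("nmarafo/truthful_qa_TrueFalse", split="train")

print(ds.column_names)  # inspect the actual fields (question, answer, boolean label, reference, ...)
print(ds[0])
```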
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
TruthfulQA:
```bibtex
@misc{lin2021truthfulqa,
      title={TruthfulQA: Measuring How Models Mimic Human Falsehoods},
      author={Stephanie Lin and Jacob Hilton and Owain Evans},
      year={2021},
      eprint={2109.07958},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | nmarafo/truthful_qa_TrueFalse | [
"task_categories:table-question-answering",
"language:en",
"license:apache-2.0",
"arxiv:2109.07958",
"region:us"
] | 2024-01-14T11:40:36+00:00 | {"language": ["en"], "license": "apache-2.0", "task_categories": ["table-question-answering"]} | 2024-01-14T12:01:51+00:00 |
2789948812658161e99db2bff82bb1fc28943ac1 |
# Dataset of pompeo_magno/ポンペオ・マーニョ/庞培·马格诺 (Azur Lane)
This is the dataset of pompeo_magno/ポンペオ・マーニョ/庞培·马格诺 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `bangs, pink_hair, long_hair, hat, blunt_bangs, green_eyes, ribbon, two_side_up, yellow_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 16.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pompeo_magno_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 8.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pompeo_magno_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 25 | 19.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pompeo_magno_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 14.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pompeo_magno_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 25 | 28.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pompeo_magno_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pompeo_magno_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, long_sleeves, solo, open_mouth, white_dress, barefoot, beret, black_ribbon, bow, hair_ribbon, lying, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | solo | open_mouth | white_dress | barefoot | beret | black_ribbon | bow | hair_ribbon | lying | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:-------------|:--------------|:-----------|:--------|:---------------|:------|:--------------|:--------|:--------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/pompeo_magno_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T11:41:05+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T11:47:34+00:00 |
1cfe2486214f143906e3efee03daf06114df714a |
# Dataset of haguro/羽黒/羽黑 (Azur Lane)
This is the dataset of haguro/羽黒/羽黑 (Azur Lane), containing 11 images and their tags.
The core tags of this character are `black_hair, hair_ornament, red_eyes, bangs, earrings, hairclip, breasts, ear_piercing, multicolored_hair, hair_between_eyes, streaked_hair, ponytail, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 16.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 8.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 28 | 19.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 13.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 28 | 28.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/haguro_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | midriff, 1girl, black_choker, crop_top, jewelry, navel, solo, black_shirt, looking_at_viewer, piercing, short_sleeves, pleated_skirt, belt, black_serafuku, black_skirt, stomach, black_nails, black_sailor_collar, blush, closed_mouth, collarbone, holding, nail_polish, purple_neckerchief, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | midriff | 1girl | black_choker | crop_top | jewelry | navel | solo | black_shirt | looking_at_viewer | piercing | short_sleeves | pleated_skirt | belt | black_serafuku | black_skirt | stomach | black_nails | black_sailor_collar | blush | closed_mouth | collarbone | holding | nail_polish | purple_neckerchief | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------|:--------|:---------------|:-----------|:----------|:--------|:-------|:--------------|:--------------------|:-----------|:----------------|:----------------|:-------|:-----------------|:--------------|:----------|:--------------|:----------------------|:--------|:---------------|:-------------|:----------|:--------------|:---------------------|:----------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/haguro_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T11:41:17+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T11:45:16+00:00 |
789cedf394c999d699742a00ac28bfe460d8ab35 |
This dataset comprises a collection of traditional Tajik names, sourced from the [Catalogue of Tajik National Names](https://kumitaizabon.tj/tg/catalogue-of-tajik-national-names). It includes a variety of names along with their transliterations and descriptions. The dataset is designed to serve as a resource for linguistic and cultural studies, providing insights into naming conventions and practices in Tajik culture.
Columns:
1. `Number`: A unique identifier for each name entry.
2. `Tajik`: The name in the Tajik language.
3. `Russian`: The Russian transliteration of the name.
4. `English`: The English transliteration of the name.
5. `Description`: A category indicating the gender association of the name, with possible values being 'Male' or 'Female'.
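Given the columns above, a minimal loading sketch with the Hugging Face `datasets` library might look as follows (it assumes the repository's data files can be read directly by `load_dataset` and that a `train` split exists; both are assumptions, not documented here):
```python
from datasets import load_dataset

# Sketch only: assumes the files load directly and that a "train" split exists.
names = load_dataset("sobir-hf/tajik-names", split="train")

# Column names follow the list above: Number, Tajik, Russian, English, Description.
for row in names.select(range(3)):
    print(row["Tajik"], "-", row["Russian"], "-", row["English"], "-", row["Description"])
```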
The dataset is suitable for researchers and enthusiasts interested in Central Asian cultures, linguistics, and gender studies in naming practices. | sobir-hf/tajik-names | [
"size_categories:1K<n<10K",
"language:tg",
"license:mit",
"tajik",
"names",
"region:us"
] | 2024-01-14T11:44:33+00:00 | {"language": ["tg"], "license": "mit", "size_categories": ["1K<n<10K"], "pretty_name": "Tajik National Names", "tags": ["tajik", "names"]} | 2024-01-14T12:17:35+00:00 |
5ef89f863e871e06a8648626691b5e1ecaec2705 | Miladsol/Fa-to-En | [
"region:us"
] | 2024-01-14T11:46:30+00:00 | {"dataset_info": {"features": [{"name": "fa", "dtype": "string"}, {"name": "en", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 205131405, "num_examples": 1254412}, {"name": "val", "num_bytes": 43840074, "num_examples": 268803}, {"name": "test", "num_bytes": 43840565, "num_examples": 268803}], "download_size": 185949756, "dataset_size": 292812044}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "val", "path": "data/val-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-14T11:52:46+00:00 |
|
ecf21c2885972560c663f0c7d13b63d928bc589f |
# Dataset Card for Evaluation run of AI-B/UTENA-7B-NSFW-V2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AI-B/UTENA-7B-NSFW-V2](https://huggingface.co/AI-B/UTENA-7B-NSFW-V2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AI-B__UTENA-7B-NSFW-V2",
"harness_winogrande_5",
split="train")
```
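To enumerate the available configurations programmatically instead of picking one by name, the `datasets` library provides a helper; a minimal sketch:
```python
from datasets import get_dataset_config_names

# List every configuration of this details dataset
# (one per evaluated task, plus the aggregated "results" configuration).
configs = get_dataset_config_names("open-llm-leaderboard/details_AI-B__UTENA-7B-NSFW-V2")
print(len(configs))
print(configs[:5])
```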
## Latest results
These are the [latest results from run 2024-01-14T11:48:04.187010](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-B__UTENA-7B-NSFW-V2/blob/main/results_2024-01-14T11-48-04.187010.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split of its eval):
```python
{
"all": {
"acc": 0.638083652271864,
"acc_stderr": 0.03246467430851539,
"acc_norm": 0.6431039350752417,
"acc_norm_stderr": 0.03311589246690635,
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.47807391011550315,
"mc2_stderr": 0.014833615164608181
},
"harness|arc:challenge|25": {
"acc": 0.6015358361774744,
"acc_stderr": 0.014306946052735565,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.0140841331181043
},
"harness|hellaswag|10": {
"acc": 0.6462856004779924,
"acc_stderr": 0.004771447244095128,
"acc_norm": 0.8454491137223661,
"acc_norm_stderr": 0.003607372606295101
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.04966570903978529,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.04966570903978529
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.032321469162244675,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.032321469162244675
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.01619780795684805,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.01619780795684805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217578,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2335195530726257,
"acc_stderr": 0.014149575348976267,
"acc_norm": 0.2335195530726257,
"acc_norm_stderr": 0.014149575348976267
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879905,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.025122637608816646,
"acc_norm": 0.7331189710610932,
"acc_norm_stderr": 0.025122637608816646
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032209,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032209
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389844,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389844
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360375,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827058,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827058
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.47807391011550315,
"mc2_stderr": 0.014833615164608181
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722759
},
"harness|gsm8k|5": {
"acc": 0.42380591357088704,
"acc_stderr": 0.013611632008810366
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AI-B__UTENA-7B-NSFW-V2 | [
"region:us"
] | 2024-01-14T11:50:22+00:00 | {"pretty_name": "Evaluation run of AI-B/UTENA-7B-NSFW-V2", "dataset_summary": "Dataset automatically created during the evaluation run of model [AI-B/UTENA-7B-NSFW-V2](https://huggingface.co/AI-B/UTENA-7B-NSFW-V2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AI-B__UTENA-7B-NSFW-V2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T11:48:04.187010](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-B__UTENA-7B-NSFW-V2/blob/main/results_2024-01-14T11-48-04.187010.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.638083652271864,\n \"acc_stderr\": 0.03246467430851539,\n \"acc_norm\": 0.6431039350752417,\n \"acc_norm_stderr\": 0.03311589246690635,\n \"mc1\": 0.3243574051407589,\n \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.47807391011550315,\n \"mc2_stderr\": 0.014833615164608181\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.014306946052735565,\n \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.0140841331181043\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6462856004779924,\n \"acc_stderr\": 0.004771447244095128,\n \"acc_norm\": 0.8454491137223661,\n \"acc_norm_stderr\": 0.003607372606295101\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 
0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.032321469162244675,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.032321469162244675\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6410256410256411,\n \"acc_stderr\": 
0.024321738484602354,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.03017680828897434,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.03017680828897434\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.01619780795684805,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.01619780795684805\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728742,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728742\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n \"acc_stderr\": 0.014248873549217578,\n \"acc_norm\": 0.8020434227330779,\n \"acc_norm_stderr\": 0.014248873549217578\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2335195530726257,\n \"acc_stderr\": 0.014149575348976267,\n \"acc_norm\": 0.2335195530726257,\n \"acc_norm_stderr\": 0.014149575348976267\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879905,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n \"acc_stderr\": 0.025122637608816646,\n \"acc_norm\": 0.7331189710610932,\n \"acc_norm_stderr\": 0.025122637608816646\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n \"acc_stderr\": 0.012719949543032209,\n \"acc_norm\": 0.4556714471968709,\n \"acc_norm_stderr\": 0.012719949543032209\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389844,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389844\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360375,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360375\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827058,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827058\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.47807391011550315,\n \"mc2_stderr\": 0.014833615164608181\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722759\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42380591357088704,\n \"acc_stderr\": 0.013611632008810366\n }\n}\n```", "repo_url": "https://huggingface.co/AI-B/UTENA-7B-NSFW-V2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|arc:challenge|25_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|gsm8k|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hellaswag|10_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-48-04.187010.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-48-04.187010.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-48-04.187010.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T11-48-04.187010.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-48-04.187010.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-48-04.187010.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["**/details_harness|winogrande|5_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T11-48-04.187010.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T11_48_04.187010", "path": ["results_2024-01-14T11-48-04.187010.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T11-48-04.187010.parquet"]}]}]} | 2024-01-14T11:50:43+00:00 |
ac0094f1b469a58d3d00d6ee4eb52447fdec412b | ShreeyaVenneti/50entries_TRAIN_CSR_AS_REFERENCE_VISUAL_TEXT_2COLUMNS_50entries | [
"region:us"
] | 2024-01-14T11:56:12+00:00 | {} | 2024-01-14T11:56:30+00:00 |
|
ccffba0a8ec08cbc97878a0f9c197f47cb01f36e | ISTNetworks/amerr_arabic | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T11:59:05+00:00 | {"license": "apache-2.0"} | 2024-01-14T12:14:02+00:00 |
|
9d411092d6b327e6c6aa4dee5515eeb40e8df28b |
# Dataset Card for Evaluation run of Weyaxi/Cosmosis-3x34B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Cosmosis-3x34B](https://huggingface.co/Weyaxi/Cosmosis-3x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Cosmosis-3x34B",
"harness_winogrande_5",
split="train")
```
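If you want to see what is available before loading anything, the configurations can also be listed programmatically. The short sketch below is not part of the generated card: it only assumes the `datasets` library's `get_dataset_config_names` helper and the "results" configuration / "latest" split naming described above.
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Weyaxi__Cosmosis-3x34B"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), "configurations available")

# Load the aggregated metrics of the most recent run via the "latest" split.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```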
## Latest results
These are the [latest results from run 2024-01-14T11:59:17.025888](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Cosmosis-3x34B/blob/main/results_2024-01-14T11-59-17.025888.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7691798340940261,
"acc_stderr": 0.027910883477876437,
"acc_norm": 0.7725855380923361,
"acc_norm_stderr": 0.02844764712553433,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6382238408380394,
"mc2_stderr": 0.01475552588950266
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441377,
"acc_norm": 0.697098976109215,
"acc_norm_stderr": 0.013428241573185347
},
"harness|hellaswag|10": {
"acc": 0.6569408484365664,
"acc_stderr": 0.004737608340163403,
"acc_norm": 0.851822346146186,
"acc_norm_stderr": 0.003545499169558051
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9078947368421053,
"acc_stderr": 0.02353268597044349,
"acc_norm": 0.9078947368421053,
"acc_norm_stderr": 0.02353268597044349
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8150943396226416,
"acc_stderr": 0.02389335183446432,
"acc_norm": 0.8150943396226416,
"acc_norm_stderr": 0.02389335183446432
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.02477451625044016,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.02477451625044016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349417,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.701058201058201,
"acc_stderr": 0.023577604791655802,
"acc_norm": 0.701058201058201,
"acc_norm_stderr": 0.023577604791655802
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5873015873015873,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.5873015873015873,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9064516129032258,
"acc_stderr": 0.016565754668270972,
"acc_norm": 0.9064516129032258,
"acc_norm_stderr": 0.016565754668270972
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.02602465765165619,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.02602465765165619
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.017646526677233335,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.017646526677233335
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.019457390787681803,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.019457390787681803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.030242862397654002,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.030242862397654002
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8697478991596639,
"acc_stderr": 0.021863258494852118,
"acc_norm": 0.8697478991596639,
"acc_norm_stderr": 0.021863258494852118
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.040802441856289694,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.040802441856289694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9155963302752294,
"acc_stderr": 0.011918819327334879,
"acc_norm": 0.9155963302752294,
"acc_norm_stderr": 0.011918819327334879
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6620370370370371,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.6620370370370371,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065494,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8773006134969326,
"acc_stderr": 0.025777328426978927,
"acc_norm": 0.8773006134969326,
"acc_norm_stderr": 0.025777328426978927
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.625,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.9223300970873787,
"acc_stderr": 0.026501440784762752,
"acc_norm": 0.9223300970873787,
"acc_norm_stderr": 0.026501440784762752
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.01700436856813234,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.01700436856813234
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9054916985951469,
"acc_stderr": 0.01046101533819307,
"acc_norm": 0.9054916985951469,
"acc_norm_stderr": 0.01046101533819307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.838150289017341,
"acc_stderr": 0.019829299214925416,
"acc_norm": 0.838150289017341,
"acc_norm_stderr": 0.019829299214925416
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7720670391061453,
"acc_stderr": 0.014030149950805097,
"acc_norm": 0.7720670391061453,
"acc_norm_stderr": 0.014030149950805097
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8594771241830066,
"acc_stderr": 0.019899435463539946,
"acc_norm": 0.8594771241830066,
"acc_norm_stderr": 0.019899435463539946
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8360128617363344,
"acc_stderr": 0.021029576464662695,
"acc_norm": 0.8360128617363344,
"acc_norm_stderr": 0.021029576464662695
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.01810541409432967,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.01810541409432967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6560283687943262,
"acc_stderr": 0.02833801742861133,
"acc_norm": 0.6560283687943262,
"acc_norm_stderr": 0.02833801742861133
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6160365058670143,
"acc_stderr": 0.01242158783313423,
"acc_norm": 0.6160365058670143,
"acc_norm_stderr": 0.01242158783313423
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8308823529411765,
"acc_stderr": 0.022770868010113018,
"acc_norm": 0.8308823529411765,
"acc_norm_stderr": 0.022770868010113018
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8408163265306122,
"acc_stderr": 0.02342097206916635,
"acc_norm": 0.8408163265306122,
"acc_norm_stderr": 0.02342097206916635
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276908,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276908
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6382238408380394,
"mc2_stderr": 0.01475552588950266
},
"harness|winogrande|5": {
"acc": 0.8413575374901342,
"acc_stderr": 0.010267936243028214
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.01233344758104755
}
}
```
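For error analysis it can be useful to look at the per-example records rather than the aggregated numbers above. The following is a minimal, hedged sketch: the exact column names inside the details parquet files are not documented in this card, so they are only printed for inspection, and any of the task configurations listed here can be substituted for the GSM8K one.
```python
from datasets import load_dataset

# Per-example details for a single task of this evaluation run.
details = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Cosmosis-3x34B",
    "harness_gsm8k_5",
    split="latest",
)

df = details.to_pandas()              # requires pandas to be installed
print(df.columns.tolist())            # inspect which fields are available
print(len(df), "evaluated examples")
```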
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Cosmosis-3x34B | [
"region:us"
] | 2024-01-14T12:01:27+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Cosmosis-3x34B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Cosmosis-3x34B](https://huggingface.co/Weyaxi/Cosmosis-3x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Cosmosis-3x34B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T11:59:17.025888](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Cosmosis-3x34B/blob/main/results_2024-01-14T11-59-17.025888.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7691798340940261,\n \"acc_stderr\": 0.027910883477876437,\n \"acc_norm\": 0.7725855380923361,\n \"acc_norm_stderr\": 0.02844764712553433,\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6382238408380394,\n \"mc2_stderr\": 0.01475552588950266\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441377,\n \"acc_norm\": 0.697098976109215,\n \"acc_norm_stderr\": 0.013428241573185347\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6569408484365664,\n \"acc_stderr\": 0.004737608340163403,\n \"acc_norm\": 0.851822346146186,\n \"acc_norm_stderr\": 0.003545499169558051\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9078947368421053,\n \"acc_stderr\": 0.02353268597044349,\n \"acc_norm\": 0.9078947368421053,\n \"acc_norm_stderr\": 0.02353268597044349\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8150943396226416,\n \"acc_stderr\": 0.02389335183446432,\n \"acc_norm\": 0.8150943396226416,\n \"acc_norm_stderr\": 0.02389335183446432\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n \"acc_stderr\": 0.02477451625044016,\n \"acc_norm\": 0.9027777777777778,\n \"acc_norm_stderr\": 0.02477451625044016\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 
0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349417,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349417\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.034165204477475494,\n \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.034165204477475494\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.701058201058201,\n \"acc_stderr\": 0.023577604791655802,\n \"acc_norm\": 0.701058201058201,\n \"acc_norm_stderr\": 0.023577604791655802\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5873015873015873,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.5873015873015873,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9064516129032258,\n \"acc_stderr\": 0.016565754668270972,\n \"acc_norm\": 0.9064516129032258,\n \"acc_norm_stderr\": 0.016565754668270972\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.02602465765165619,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.02602465765165619\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9343434343434344,\n \"acc_stderr\": 0.017646526677233335,\n \"acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.017646526677233335\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 
0.019457390787681803,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.019457390787681803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.030242862397654002,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.030242862397654002\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8697478991596639,\n \"acc_stderr\": 0.021863258494852118,\n \"acc_norm\": 0.8697478991596639,\n \"acc_norm_stderr\": 0.021863258494852118\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.48344370860927155,\n \"acc_stderr\": 0.040802441856289694,\n \"acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.040802441856289694\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9155963302752294,\n \"acc_stderr\": 0.011918819327334879,\n \"acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.011918819327334879\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6620370370370371,\n \"acc_stderr\": 0.03225941352631295,\n \"acc_norm\": 0.6620370370370371,\n \"acc_norm_stderr\": 0.03225941352631295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065494,\n \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9223300970873787,\n \"acc_stderr\": 0.026501440784762752,\n \"acc_norm\": 0.9223300970873787,\n \"acc_norm_stderr\": 0.026501440784762752\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.01700436856813234,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.01700436856813234\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9054916985951469,\n \"acc_stderr\": 0.01046101533819307,\n \"acc_norm\": 0.9054916985951469,\n \"acc_norm_stderr\": 
0.01046101533819307\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.838150289017341,\n \"acc_stderr\": 0.019829299214925416,\n \"acc_norm\": 0.838150289017341,\n \"acc_norm_stderr\": 0.019829299214925416\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7720670391061453,\n \"acc_stderr\": 0.014030149950805097,\n \"acc_norm\": 0.7720670391061453,\n \"acc_norm_stderr\": 0.014030149950805097\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8594771241830066,\n \"acc_stderr\": 0.019899435463539946,\n \"acc_norm\": 0.8594771241830066,\n \"acc_norm_stderr\": 0.019899435463539946\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8360128617363344,\n \"acc_stderr\": 0.021029576464662695,\n \"acc_norm\": 0.8360128617363344,\n \"acc_norm_stderr\": 0.021029576464662695\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.01810541409432967,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.01810541409432967\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6560283687943262,\n \"acc_stderr\": 0.02833801742861133,\n \"acc_norm\": 0.6560283687943262,\n \"acc_norm_stderr\": 0.02833801742861133\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6160365058670143,\n \"acc_stderr\": 0.01242158783313423,\n \"acc_norm\": 0.6160365058670143,\n \"acc_norm_stderr\": 0.01242158783313423\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8308823529411765,\n \"acc_stderr\": 0.022770868010113018,\n \"acc_norm\": 0.8308823529411765,\n \"acc_norm_stderr\": 0.022770868010113018\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.02342097206916635,\n \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.02342097206916635\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6382238408380394,\n \"mc2_stderr\": 0.01475552588950266\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.010267936243028214\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \"acc_stderr\": 0.01233344758104755\n }\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/Cosmosis-3x34B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|arc:challenge|25_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|gsm8k|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hellaswag|10_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-59-17.025888.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-59-17.025888.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-59-17.025888.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T11-59-17.025888.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-59-17.025888.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-59-17.025888.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["**/details_harness|winogrande|5_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T11-59-17.025888.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T11_59_17.025888", "path": ["results_2024-01-14T11-59-17.025888.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T11-59-17.025888.parquet"]}]}]} | 2024-01-14T12:01:57+00:00 |
688f7995dc128513f70b4d08b5c0e4d727641f49 |
# Dataset Card for Evaluation run of FelixChao/NarutoDolphin-10B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/NarutoDolphin-10B](https://huggingface.co/FelixChao/NarutoDolphin-10B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__NarutoDolphin-10B",
"harness_winogrande_5",
split="train")
```
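If you only need the aggregated metrics rather than the per-sample details, the "results" configuration described above can be loaded in the same way. This is a minimal sketch; the "latest" split name is an assumption based on the split naming used by these leaderboard detail datasets (the per-task example above uses "train"):
```python
from datasets import load_dataset

# Load the aggregated "results" configuration of this evaluation run.
# "latest" is assumed to point at the most recent run; fall back to the
# timestamped split (or "train") if it is not available in your copy.
results = load_dataset(
    "open-llm-leaderboard/details_FelixChao__NarutoDolphin-10B",
    "results",
    split="latest",
)

# Each row holds the aggregated metrics of one evaluation run.
print(results[0])
```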
## Latest results
These are the [latest results from run 2024-01-14T12:12:30.168914](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__NarutoDolphin-10B/blob/main/results_2024-01-14T12-12-30.168914.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6306583942825644,
"acc_stderr": 0.03252627508388141,
"acc_norm": 0.632276909104878,
"acc_norm_stderr": 0.03317986227116511,
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5912860013096678,
"mc2_stderr": 0.015586868131613507
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.014169664520303098,
"acc_norm": 0.6382252559726962,
"acc_norm_stderr": 0.014041957945038083
},
"harness|hellaswag|10": {
"acc": 0.6542521410077674,
"acc_stderr": 0.0047463946133845325,
"acc_norm": 0.841665006970723,
"acc_norm_stderr": 0.0036430875292137216
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137282,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137282
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6528301886792452,
"acc_stderr": 0.029300101705549652,
"acc_norm": 0.6528301886792452,
"acc_norm_stderr": 0.029300101705549652
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895536,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895536
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396993,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396993
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612907,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612907
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640773,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640773
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.01374079725857982,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.01374079725857982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3642458100558659,
"acc_stderr": 0.0160943387684746,
"acc_norm": 0.3642458100558659,
"acc_norm_stderr": 0.0160943387684746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.02563082497562135,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.02563082497562135
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599923,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599923
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013014,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5912860013096678,
"mc2_stderr": 0.015586868131613507
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126735
},
"harness|gsm8k|5": {
"acc": 0.5943896891584534,
"acc_stderr": 0.013524848894462115
}
}
```
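For a quick look outside of the `datasets` library, the JSON file linked in the "Latest results" section can also be fetched directly from the repository. This is a small sketch, assuming the raw file keeps the aggregated scores either at the top level (as shown above) or under a "results" key:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the "Latest results" link.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_FelixChao__NarutoDolphin-10B",
    repo_type="dataset",
    filename="results_2024-01-14T12-12-30.168914.json",
)

with open(path) as f:
    data = json.load(f)

# The aggregated scores may sit under a top-level "results" key in the raw file.
metrics = data.get("results", data)
print(metrics["all"])
```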
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__NarutoDolphin-10B | [
"region:us"
] | 2024-01-14T12:14:45+00:00 | {"pretty_name": "Evaluation run of FelixChao/NarutoDolphin-10B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/NarutoDolphin-10B](https://huggingface.co/FelixChao/NarutoDolphin-10B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__NarutoDolphin-10B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T12:12:30.168914](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__NarutoDolphin-10B/blob/main/results_2024-01-14T12-12-30.168914.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6306583942825644,\n \"acc_stderr\": 0.03252627508388141,\n \"acc_norm\": 0.632276909104878,\n \"acc_norm_stderr\": 0.03317986227116511,\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5912860013096678,\n \"mc2_stderr\": 0.015586868131613507\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.014169664520303098,\n \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038083\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6542521410077674,\n \"acc_stderr\": 0.0047463946133845325,\n \"acc_norm\": 0.841665006970723,\n \"acc_norm_stderr\": 0.0036430875292137216\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137282,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137282\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6528301886792452,\n \"acc_stderr\": 0.029300101705549652,\n \"acc_norm\": 0.6528301886792452,\n \"acc_norm_stderr\": 0.029300101705549652\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396993,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396993\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612907,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612907\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640773,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640773\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8199233716475096,\n \"acc_stderr\": 0.01374079725857982,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.01374079725857982\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3642458100558659,\n \"acc_stderr\": 0.0160943387684746,\n \"acc_norm\": 0.3642458100558659,\n \"acc_norm_stderr\": 0.0160943387684746\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.02563082497562135,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.02563082497562135\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n \"acc_stderr\": 0.012687818419599923,\n \"acc_norm\": 0.44328552803129073,\n \"acc_norm_stderr\": 0.012687818419599923\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.027686913588013014,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.027686913588013014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5912860013096678,\n \"mc2_stderr\": 0.015586868131613507\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5943896891584534,\n \"acc_stderr\": 0.013524848894462115\n 
}\n}\n```", "repo_url": "https://huggingface.co/FelixChao/NarutoDolphin-10B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|arc:challenge|25_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|gsm8k|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hellaswag|10_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-12-30.168914.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-12-30.168914.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-12-30.168914.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T12-12-30.168914.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-12-30.168914.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T12_12_30.168914", "path": ["**/details_harness|winogrande|5_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T12-12-30.168914.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T12_12_30.168914", "path": ["results_2024-01-14T12-12-30.168914.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T12-12-30.168914.parquet"]}]}]} | 2024-01-14T12:15:06+00:00 |
440236fb2384eeab89b92263b7b87d5fe0b3d2a4 | # Dataset Card for "vietnamese-retrieval-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | thanhdath/vietnamese-retrieval-v2 | [
"region:us"
] | 2024-01-14T12:15:06+00:00 | {"dataset_info": {"features": [{"name": "query_id", "dtype": "string"}, {"name": "query", "dtype": "string"}, {"name": "positive_passages", "list": [{"name": "docid", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "title", "dtype": "string"}]}, {"name": "negative_passages", "list": [{"name": "docid", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "title", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 6347713728, "num_examples": 574167}], "download_size": 2987843540, "dataset_size": 6347713728}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T12:22:46+00:00 |
247fa39af45c07c87ca725e459203f6f904097bc | ShreeyaVenneti/50entries_TRAIN_2COLUMNS_VISUAL_ACOUSTIC_TEXT_AS_PER_CSR_REFERENCE | [
"region:us"
] | 2024-01-14T12:25:53+00:00 | {} | 2024-01-14T12:26:07+00:00 |
|
9f6aa988ad020c2e0545b6d164f4a3ccebe0e018 | moghadas76/metrla | [
"region:us"
] | 2024-01-14T12:26:14+00:00 | {} | 2024-01-14T12:26:14+00:00 |
|
46f68ae1897088c7ddd17402d776a980469c77d9 |
# Dataset Card for Evaluation run of dhanushreddy29/BrokenKeyboardMerge
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dhanushreddy29/BrokenKeyboardMerge](https://huggingface.co/dhanushreddy29/BrokenKeyboardMerge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
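Since all of these configurations follow the same naming scheme, they can also be discovered programmatically; a minimal sketch using the `datasets` helper for listing configuration names (the exact list depends on the tasks covered by this run):

```python
from datasets import get_dataset_config_names

# Enumerate the per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboardMerge"
)
print(len(configs))
print([c for c in configs if c.startswith("harness_hendrycksTest")][:5])
```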
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboardMerge",
"harness_winogrande_5",
split="train")
```
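The aggregated metrics mentioned above live in the "results" configuration; a minimal sketch for loading them, assuming it exposes the same "latest" split as the per-task configurations:

```python
from datasets import load_dataset

# Aggregated metrics of the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboardMerge",
    "results",
    split="latest",
)
print(results[0])
```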
## Latest results
These are the [latest results from run 2024-01-14T12:28:57.888363](https://huggingface.co/datasets/open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboardMerge/blob/main/results_2024-01-14T12-28-57.888363.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5820084848111404,
"acc_stderr": 0.033007700619969375,
"acc_norm": 0.5876752361456778,
"acc_norm_stderr": 0.03370856721497056,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.520009813591209,
"mc2_stderr": 0.01568688657303073
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230916,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.01433223630679015
},
"harness|hellaswag|10": {
"acc": 0.6292571200955985,
"acc_stderr": 0.0048201660022530795,
"acc_norm": 0.8124875522804222,
"acc_norm_stderr": 0.0038952463204527657
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113729,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113729
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.025649381063029265,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.025649381063029265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.034454876862647144,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.034454876862647144
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.01886188502153473,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.01886188502153473
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.02955429260569507,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.02955429260569507
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990945,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990945
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748927,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748927
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946005,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946005
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3217877094972067,
"acc_stderr": 0.015624236160792577,
"acc_norm": 0.3217877094972067,
"acc_norm_stderr": 0.015624236160792577
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302888,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302888
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4152542372881356,
"acc_stderr": 0.012585471793400664,
"acc_norm": 0.4152542372881356,
"acc_norm_stderr": 0.012585471793400664
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.019780465954777515,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.019780465954777515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.03038726291954773,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.03038726291954773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573012,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573012
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.520009813591209,
"mc2_stderr": 0.01568688657303073
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722747
},
"harness|gsm8k|5": {
"acc": 0.25928733889310085,
"acc_stderr": 0.012071405369905504
}
}
```
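For a quick sanity check, the same numbers can be pulled straight from the results file linked above; a sketch using `huggingface_hub`, with the filename taken from the link in this section:

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON for the 2024-01-14T12:28:57 run.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboardMerge",
    filename="results_2024-01-14T12-28-57.888363.json",
    repo_type="dataset",
)
with open(path) as f:
    run_results = json.load(f)
print(list(run_results))  # inspect the top-level keys before drilling into metrics
```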
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboardMerge | [
"region:us"
] | 2024-01-14T12:31:15+00:00 | {"pretty_name": "Evaluation run of dhanushreddy29/BrokenKeyboardMerge", "dataset_summary": "Dataset automatically created during the evaluation run of model [dhanushreddy29/BrokenKeyboardMerge](https://huggingface.co/dhanushreddy29/BrokenKeyboardMerge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboardMerge\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T12:28:57.888363](https://huggingface.co/datasets/open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboardMerge/blob/main/results_2024-01-14T12-28-57.888363.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5820084848111404,\n \"acc_stderr\": 0.033007700619969375,\n \"acc_norm\": 0.5876752361456778,\n \"acc_norm_stderr\": 0.03370856721497056,\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.520009813591209,\n \"mc2_stderr\": 0.01568688657303073\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230916,\n \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.01433223630679015\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6292571200955985,\n \"acc_stderr\": 0.0048201660022530795,\n \"acc_norm\": 0.8124875522804222,\n \"acc_norm_stderr\": 0.0038952463204527657\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113729,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113729\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n \"acc_stderr\": 0.025649381063029265,\n \"acc_norm\": 0.7161290322580646,\n \"acc_norm_stderr\": 0.025649381063029265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.034454876862647144,\n \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.034454876862647144\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7376146788990826,\n \"acc_stderr\": 0.01886188502153473,\n \"acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.01886188502153473\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569507,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569507\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990945,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990945\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.02363687331748927,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.02363687331748927\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n \"acc_stderr\": 
0.014987270640946005,\n \"acc_norm\": 0.7726692209450831,\n \"acc_norm_stderr\": 0.014987270640946005\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3217877094972067,\n \"acc_stderr\": 0.015624236160792577,\n \"acc_norm\": 0.3217877094972067,\n \"acc_norm_stderr\": 0.015624236160792577\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302888,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302888\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4152542372881356,\n \"acc_stderr\": 0.012585471793400664,\n \"acc_norm\": 0.4152542372881356,\n \"acc_norm_stderr\": 0.012585471793400664\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777515,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777515\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.03038726291954773,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.03038726291954773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n \"acc_stderr\": 0.030965903123573012,\n \"acc_norm\": 0.7412935323383084,\n \"acc_norm_stderr\": 0.030965903123573012\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.520009813591209,\n \"mc2_stderr\": 0.01568688657303073\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722747\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.25928733889310085,\n \"acc_stderr\": 0.012071405369905504\n }\n}\n```", "repo_url": 
"https://huggingface.co/dhanushreddy29/BrokenKeyboardMerge", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|arc:challenge|25_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|gsm8k|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hellaswag|10_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-28-57.888363.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-28-57.888363.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-28-57.888363.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T12-28-57.888363.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-28-57.888363.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-28-57.888363.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["**/details_harness|winogrande|5_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T12-28-57.888363.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T12_28_57.888363", "path": ["results_2024-01-14T12-28-57.888363.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T12-28-57.888363.parquet"]}]}]} | 2024-01-14T12:31:36+00:00 |
1588e143af96f49e0c7c2bb2cb69759e5a7b5bca |
# Dataset Card for MS COCO Karpathy in German
This dataset contains MS COCO Karpathy captions that were machine translated from English to German using [opus-mt-en-de](https://huggingface.co/Helsinki-NLP/opus-mt-en-de).
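A minimal loading sketch with the Hugging Face `datasets` library is shown below; the split name and the printed fields are assumptions and should be checked against the actual schema of this repo.
```python
from datasets import load_dataset

# Load the German MS COCO Karpathy captions from the Hub.
# The split name "train" is an assumption; inspect the dataset to confirm.
ds = load_dataset("Jotschi/coco-karpathy-opus-de", split="train")

print(ds)      # column names and number of rows
print(ds[0])   # first translated caption record
```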
## Dataset Details
### Dataset Description
- **Curated by:** [More Information Needed]
- **Language(s) (NLP):** German (machine translated from English)
- **License:** cc-by-4.0
### Dataset Sources
The processed [MS COCO datasets](https://cocodataset.org/#download) (Karpathy Split) in this repo are based on the following sources:
| Type | MD5 | URL |
|------------|----------------------------------|-----------------------------------------------------------------------------------------------|
| Train | aa31ac474cf6250ebb81d18348a07ed8 | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_train.json |
| Validation | b273847456ef5580e33713b1f7de52a0 | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_val.json |
| Test | 3ff34b0ef2db02d01c37399f6a2a6cd1 | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_test.json |
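The checksums above can be used to verify a download; a minimal sketch using only the Python standard library (the local filename is arbitrary):
```python
import hashlib
import urllib.request

URL = "https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_val.json"
EXPECTED_MD5 = "b273847456ef5580e33713b1f7de52a0"  # Validation split, from the table above

# Download the annotation file to the working directory.
local_path, _ = urllib.request.urlretrieve(URL, "coco_karpathy_val.json")

# Compute the MD5 of the downloaded file and compare it with the published checksum.
md5 = hashlib.md5(open(local_path, "rb").read()).hexdigest()
print(md5, "OK" if md5 == EXPECTED_MD5 else "MISMATCH")
```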
MS COCO:
- **Download:** https://cocodataset.org/#download
- **Paper:** http://arxiv.org/abs/1405.0312
## Dataset Creation
This dataset was generated by translating the English captions of the Karpathy annotation files listed above into German with [opus-mt-en-de](https://huggingface.co/Helsinki-NLP/opus-mt-en-de).
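The original translation script is not part of this card; the following is a minimal sketch of how a single caption can be translated with this checkpoint via Hugging Face `transformers` (the sample caption and generation settings are assumptions, not the ones actually used):
```python
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-de"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

caption_en = "A man riding a wave on top of a surfboard."

# Tokenize, translate, and decode a single English caption into German.
batch = tokenizer([caption_en], return_tensors="pt", padding=True)
generated = model.generate(**batch)
caption_de = tokenizer.decode(generated[0], skip_special_tokens=True)
print(caption_de)
```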
| Jotschi/coco-karpathy-opus-de | [
"task_categories:text-generation",
"task_categories:image-to-text",
"task_categories:text-to-image",
"annotations_creators:machine-generated",
"size_categories:n<650k",
"source_datasets:mscoco",
"language:de",
"coco",
"mscoco",
"german",
"arxiv:1405.0312",
"region:us"
] | 2024-01-14T12:38:08+00:00 | {"annotations_creators": ["machine-generated"], "language": ["de"], "size_categories": ["n<650k"], "source_datasets": ["mscoco"], "task_categories": ["text-generation", "image-to-text", "text-to-image"], "pretty_name": "MS COCO Karpathy in german", "license_name": "cc-by-4.0", "license_link": "https://creativecommons.org/licenses/by/4.0/legalcode", "tags": ["coco", "mscoco", "german"]} | 2024-01-14T13:10:49+00:00 |
e92d4976a315449c947645576bc5343043604c29 |
# Dataset of suffren/シュフラン/絮弗伦 (Azur Lane)
This is the dataset of suffren/シュフラン/絮弗伦 (Azur Lane), containing 25 images and their tags.
The core tags of this character are `long_hair, breasts, bangs, horns, blue_eyes, large_breasts, white_hair, very_long_hair, fang, grey_hair, twintails, two_side_up`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 41.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suffren_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 23.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suffren_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 64 | 49.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suffren_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 36.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suffren_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 64 | 71.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suffren_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
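The IMG+TXT packages can also be fetched directly, as an alternative to the waifuc-based raw loading shown in the next section. A minimal sketch follows; the target directory name is arbitrary, and the pairing of each image with a same-named `.txt` tag file is an assumption about the IMG+TXT layout.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# Download the 800px IMG+TXT package instead of the raw archive.
zip_file = hf_hub_download(
    repo_id='CyberHarem/suffren_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract to a local directory (name chosen arbitrarily here).
out_dir = 'suffren_800'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)

# Each image should be paired with a same-named .txt file holding its tags (assumption).
for name in sorted(os.listdir(out_dir)):
    if name.endswith('.txt'):
        with open(os.path.join(out_dir, name), encoding='utf-8') as f:
            print(name, '->', f.read().strip()[:80])
```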
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/suffren_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, open_mouth, solo, smile, blush, gloves, arm_up, holding, thighhighs, white_dress, armored_boots, gauntlets, grey_eyes, hair_ornament, see-through, shoulder_armor, simple_background, tail, thighs, white_background |
| 1 | 13 |  |  |  |  |  | 1girl, solo, blush, open_mouth, long_sleeves, looking_at_viewer, navel, hoodie, pleated_skirt, miniskirt, blue_skirt, cleavage, hair_ribbon, hood_down, stomach, ahoge, belt, black_bikini, black_thighhighs, hair_bow, thighs, underwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | open_mouth | solo | smile | blush | gloves | arm_up | holding | thighhighs | white_dress | armored_boots | gauntlets | grey_eyes | hair_ornament | see-through | shoulder_armor | simple_background | tail | thighs | white_background | long_sleeves | navel | hoodie | pleated_skirt | miniskirt | blue_skirt | cleavage | hair_ribbon | hood_down | stomach | ahoge | belt | black_bikini | black_thighhighs | hair_bow | underwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------------|:-------|:--------|:--------|:---------|:---------|:----------|:-------------|:--------------|:----------------|:------------|:------------|:----------------|:--------------|:-----------------|:--------------------|:-------|:---------|:-------------------|:---------------|:--------|:---------|:----------------|:------------|:-------------|:-----------|:--------------|:------------|:----------|:--------|:-------|:---------------|:-------------------|:-----------|:------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/suffren_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T12:43:21+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T12:49:02+00:00 |
22245e01c2b02ae0baea7ad1e53a5306e8dcd6ff | LightFury9/transliteration-telugu-words | [
"region:us"
] | 2024-01-14T12:53:26+00:00 | {"dataset_info": {"features": [{"name": "unique_identifier", "dtype": "string"}, {"name": "native word", "dtype": "string"}, {"name": "english word", "dtype": "string"}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 190122662, "num_examples": 2429562}, {"name": "test", "num_bytes": 661473, "num_examples": 10260}, {"name": "validation", "num_bytes": 507490, "num_examples": 7681}], "download_size": 91663334, "dataset_size": 191291625}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T12:53:54+00:00 |
|
d03648b950969217a6f31e81f01a4702f1e5a9bd |
# Dataset Card for Visual Genome Annotations in German
This dataset contains Visual Genome captions that were machine translated from English to German using [opus-mt-en-de](https://huggingface.co/Helsinki-NLP/opus-mt-en-de).
## Dataset Details
### Dataset Description
- **Curated by:** [More Information Needed]
- **Language(s) (NLP):** German (machine translated from English)
- **License:** cc-by-4.0
### Dataset Sources
The processed [Visual Genome](https://homes.cs.washington.edu/~ranjay/visualgenome/index.html) captions in this repo are based on the following sources:
| Type | MD5 | URL |
|----------|----------------------------------|-----------------------------------------------------------------------------------------------------------|
| Captions | 941425b651f50cdb1a6f0673eaab6260 | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/visual_genome/vg_caption.json |
Visual Genome:
- **Download:** https://homes.cs.washington.edu/~ranjay/visualgenome/index.html
- **Paper:** https://link.springer.com/article/10.1007/s11263-016-0981-7
## Dataset Creation
This dataset was generated by translating the English captions in `vg_caption.json` into German with [opus-mt-en-de](https://huggingface.co/Helsinki-NLP/opus-mt-en-de).
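As with the MS COCO card above, the translation can be reproduced in outline with the Hugging Face `transformers` translation pipeline; the example captions below are placeholders, and the actual processing settings are not documented here.
```python
from transformers import pipeline

# en->de translation pipeline backed by the same opus-mt-en-de checkpoint.
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-de")

captions_en = [
    "A tree on the side of the road.",
    "A woman holding a red umbrella.",
]

# Translate a small batch of Visual Genome captions into German.
for result in translator(captions_en):
    print(result["translation_text"])
```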
| Jotschi/visual_genome-opus-de | [
"task_categories:text-generation",
"task_categories:image-to-text",
"task_categories:text-to-image",
"annotations_creators:machine-generated",
"size_categories:n<820k",
"source_datasets:visual_genome",
"language:de",
"visual_genome",
"german",
"region:us"
] | 2024-01-14T12:54:49+00:00 | {"annotations_creators": ["machine-generated"], "language": ["de"], "size_categories": ["n<820k"], "source_datasets": ["visual_genome"], "task_categories": ["text-generation", "image-to-text", "text-to-image"], "pretty_name": "Visual Genome in german", "license_name": "cc-by-4.0", "license_link": "https://creativecommons.org/licenses/by/4.0/legalcode", "tags": ["visual_genome", "german"]} | 2024-01-14T13:03:37+00:00 |
3851c84dbc1397b8a30e27f098f4a019722b2cd9 |
# Dataset of vincennes/ヴィンセンス/文森斯 (Azur Lane)
This is the dataset of vincennes/ヴィンセンス/文森斯 (Azur Lane), containing 17 images and their tags.
The core tags of this character are `blue_eyes, long_hair, blue_hair, bangs, twintails, hair_ornament, breasts, sidelocks, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 11.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vincennes_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 9.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vincennes_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 28 | 15.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vincennes_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 11.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vincennes_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 28 | 18.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vincennes_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/vincennes_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
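The IMG+TXT packages from the table above can be used without waifuc as well. A hedged sketch, assuming the archive contains image files with same-named `.txt` tag files next to them (the usual IMG+TXT layout); adjust if the extracted contents differ:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/vincennes_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract files to your directory
img_txt_dir = 'dataset_800'
os.makedirs(img_txt_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(img_txt_dir)
# pair each image with its tag file, if present
for name in sorted(os.listdir(img_txt_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in {'.png', '.jpg', '.jpeg', '.webp'}:
        txt_path = os.path.join(img_txt_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```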
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | blush, 1girl, looking_at_viewer, solo, white_background, full_body, long_sleeves, skirt, black_jacket, black_thighhighs, simple_background, closed_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | 1girl | looking_at_viewer | solo | white_background | full_body | long_sleeves | skirt | black_jacket | black_thighhighs | simple_background | closed_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:-------------------|:------------|:---------------|:--------|:---------------|:-------------------|:--------------------|:---------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/vincennes_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T13:02:18+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T13:05:55+00:00 |
38fb5e6ffbcfbe3ae0a54a1d6ae801e9cdc397c0 |
# Dataset of u_73/U-73 (Azur Lane)
This is the dataset of u_73/U-73 (Azur Lane), containing 15 images and their tags.
The core tags of this character are `long_hair, red_eyes, black_hair, breasts, bangs, hat, very_long_hair, one_side_up`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 14.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_73_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 11.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_73_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 31 | 20.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_73_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 13.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_73_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 31 | 22.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_73_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/u_73_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | looking_at_viewer, smile, 1girl, blush, solo, jacket, navel, swimsuit, black_thighhighs, fingerless_gloves, holding, skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | smile | 1girl | blush | solo | jacket | navel | swimsuit | black_thighhighs | fingerless_gloves | holding | skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:--------|:--------|:-------|:---------|:--------|:-----------|:-------------------|:--------------------|:----------|:--------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/u_73_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T13:02:20+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T13:06:07+00:00 |
e2c96891d4e9ae0a0e488509d41ee66acb62fd20 | amzar1303/zigwheels.my-images | [
"region:us"
] | 2024-01-14T13:07:24+00:00 | {} | 2024-01-14T13:08:10+00:00 |
|
88001a7f67649e25f6c0b6324eab0c0cdc849509 | LishaNM/Mistral_dataset | [
"task_categories:text-generation",
"license:apache-2.0",
"region:us"
] | 2024-01-14T13:09:39+00:00 | {"license": "apache-2.0", "task_categories": ["text-generation"]} | 2024-01-23T10:09:54+00:00 |
|
cc327c8c9494d7ba7bfdf3a90ea79661b3e6d2ca |
# Dataset Card for Evaluation run of AI-B/UTENA-7B-V3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AI-B/UTENA-7B-V3](https://huggingface.co/AI-B/UTENA-7B-V3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AI-B__UTENA-7B-V3",
"harness_winogrande_5",
split="train")
```
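The aggregated "results" configuration described above can be loaded the same way; a small sketch (the available columns depend on the run):
```python
from datasets import load_dataset
# the "results" configuration stores the aggregated metrics; "train" points
# to the latest run, as described above
results = load_dataset("open-llm-leaderboard/details_AI-B__UTENA-7B-V3",
	"results",
	split="train")
print(results)
```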
## Latest results
These are the [latest results from run 2024-01-14T13:18:55.824915](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-B__UTENA-7B-V3/blob/main/results_2024-01-14T13-18-55.824915.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.647960372928455,
"acc_stderr": 0.03225433281918251,
"acc_norm": 0.6510088264996041,
"acc_norm_stderr": 0.03289983967691888,
"mc1": 0.3708690330477356,
"mc1_stderr": 0.016909693580248818,
"mc2": 0.5364232527046322,
"mc2_stderr": 0.01504213226474297
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131167,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892978
},
"harness|hellaswag|10": {
"acc": 0.6607249551882095,
"acc_stderr": 0.0047249566658799725,
"acc_norm": 0.8570005974905397,
"acc_norm_stderr": 0.0034935679140932906
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406786,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406786
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895518,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659356,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659356
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857413,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857413
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3564245810055866,
"acc_stderr": 0.016018239710513405,
"acc_norm": 0.3564245810055866,
"acc_norm_stderr": 0.016018239710513405
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7491961414790996,
"acc_stderr": 0.024619771956697168,
"acc_norm": 0.7491961414790996,
"acc_norm_stderr": 0.024619771956697168
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.455019556714472,
"acc_stderr": 0.012718456618701768,
"acc_norm": 0.455019556714472,
"acc_norm_stderr": 0.012718456618701768
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174923,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174923
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3708690330477356,
"mc1_stderr": 0.016909693580248818,
"mc2": 0.5364232527046322,
"mc2_stderr": 0.01504213226474297
},
"harness|winogrande|5": {
"acc": 0.8026835043409629,
"acc_stderr": 0.011185026389050376
},
"harness|gsm8k|5": {
"acc": 0.5420773313115997,
"acc_stderr": 0.013723629649844075
}
}
```
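The same figures can be read programmatically from the linked JSON file; a minimal sketch using `huggingface_hub` (the exact nesting inside the file may differ from the excerpt above):
```python
import json
from huggingface_hub import hf_hub_download

# download the results file linked above from the dataset repository
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_AI-B__UTENA-7B-V3",
    repo_type="dataset",
    filename="results_2024-01-14T13-18-55.824915.json",
)
with open(path, "r", encoding="utf-8") as f:
    data = json.load(f)

# print the top-level keys; the aggregated "all" block quoted above lives
# in (or under) this structure
print(list(data.keys()))
```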
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AI-B__UTENA-7B-V3 | [
"region:us"
] | 2024-01-14T13:10:22+00:00 | {"pretty_name": "Evaluation run of AI-B/UTENA-7B-V3", "dataset_summary": "Dataset automatically created during the evaluation run of model [AI-B/UTENA-7B-V3](https://huggingface.co/AI-B/UTENA-7B-V3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AI-B__UTENA-7B-V3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T13:18:55.824915](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-B__UTENA-7B-V3/blob/main/results_2024-01-14T13-18-55.824915.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.647960372928455,\n \"acc_stderr\": 0.03225433281918251,\n \"acc_norm\": 0.6510088264996041,\n \"acc_norm_stderr\": 0.03289983967691888,\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.016909693580248818,\n \"mc2\": 0.5364232527046322,\n \"mc2_stderr\": 0.01504213226474297\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131167,\n \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892978\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6607249551882095,\n \"acc_stderr\": 0.0047249566658799725,\n \"acc_norm\": 0.8570005974905397,\n \"acc_norm_stderr\": 0.0034935679140932906\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406786,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406786\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895518,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659356,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659356\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n 
\"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857413,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857413\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501534,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501534\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 
0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3564245810055866,\n \"acc_stderr\": 0.016018239710513405,\n \"acc_norm\": 0.3564245810055866,\n \"acc_norm_stderr\": 0.016018239710513405\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7491961414790996,\n \"acc_stderr\": 0.024619771956697168,\n \"acc_norm\": 0.7491961414790996,\n \"acc_norm_stderr\": 0.024619771956697168\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n \"acc_stderr\": 0.012718456618701768,\n \"acc_norm\": 0.455019556714472,\n \"acc_norm_stderr\": 0.012718456618701768\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174923,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174923\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.016909693580248818,\n \"mc2\": 0.5364232527046322,\n \"mc2_stderr\": 0.01504213226474297\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8026835043409629,\n \"acc_stderr\": 0.011185026389050376\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5420773313115997,\n \"acc_stderr\": 0.013723629649844075\n }\n}\n```", "repo_url": "https://huggingface.co/AI-B/UTENA-7B-V3", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|arc:challenge|25_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|arc:challenge|25_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|gsm8k|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|gsm8k|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hellaswag|10_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hellaswag|10_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-08-02.429354.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T13-08-02.429354.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-18-55.824915.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-18-55.824915.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-18-55.824915.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T13-18-55.824915.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-08-02.429354.parquet"]}, 
{"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["**/details_harness|winogrande|5_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": ["**/details_harness|winogrande|5_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T13-18-55.824915.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T13_08_02.429354", "path": ["results_2024-01-14T13-08-02.429354.parquet"]}, {"split": "2024_01_14T13_18_55.824915", "path": 
["results_2024-01-14T13-18-55.824915.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T13-18-55.824915.parquet"]}]}]} | 2024-01-14T13:21:15+00:00 |
e91707ac0e98f1062052b4a487edbbd579d36111 | Sussyb/Ritsu | [
"region:us"
] | 2024-01-14T13:10:58+00:00 | {} | 2024-01-14T13:27:34+00:00 |
|
e5c10415d5c8b1238e65ff808139f911a3816286 |
# Dataset Card for Visual Genome Annotations in Simple English
This dataset contains captions that were rephrased into simple English so that a young child can understand them.
## Dataset Details
### Dataset Description
- **Curated by:** [More Information Needed]
- **Language(s) (NLP):** English
- **License:** CC BY 4.0
### Dataset Sources
The processed [Visual Genome](https://homes.cs.washington.edu/~ranjay/visualgenome/index.html) captions in this repo are based on the following sources:
* vg_caption.json (checksum `941425b651f50cdb1a6f0673eaab6260`) — https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/visual_genome/vg_caption.json
Visual Genome:
- **Download:** https://homes.cs.washington.edu/~ranjay/visualgenome/index.html
- **Paper:** https://link.springer.com/article/10.1007/s11263-016-0981-7
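A minimal sketch for fetching and verifying the source file listed above (the URL and the 32-character checksum come from the list; treating the checksum as an MD5 hash is an assumption):

```python
import hashlib
import urllib.request

URL = "https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/visual_genome/vg_caption.json"
EXPECTED = "941425b651f50cdb1a6f0673eaab6260"  # checksum from the source list (assumed MD5)

# Download the raw Visual Genome caption file.
urllib.request.urlretrieve(URL, "vg_caption.json")

# Hash the file in chunks so it does not need to fit in memory.
md5 = hashlib.md5()
with open("vg_caption.json", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        md5.update(chunk)

assert md5.hexdigest() == EXPECTED, "checksum mismatch, re-download the file"
print("vg_caption.json verified")
```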
## Dataset Creation
This dataset was generated by processing the annotations via [Mistral 7B](https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-AWQ).
Prompt used:
```
Rewrite the sentence " + caption + " for a 3 to 4 year old child. Give only one simple sentence. Don't use the word see. Give only a single answer.
```
A filter was applied to store only captions that matched the common output format, and a best-effort filter reduced the chance of including multiple example sentences in the output. A sketch of this processing step is shown below.
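A minimal sketch of the rephrasing step, assuming a standard `transformers` text-generation pipeline with a Mistral-7B-Instruct checkpoint (the actual run used the AWQ build linked above; the model name, generation settings, and the post-hoc filter here are illustrative assumptions):

```python
from transformers import pipeline

# Illustrative stand-in for the AWQ build linked above.
pipe = pipeline("text-generation", model="mistralai/Mistral-7B-Instruct-v0.2", device_map="auto")

def simplify(caption: str) -> str:
    # Wrap the prompt from this card in the Mistral instruction format.
    prompt = (
        '[INST] Rewrite the sentence "' + caption + '" for a 3 to 4 year old child. '
        "Give only one simple sentence. Don't use the word see. Give only a single answer. [/INST]"
    )
    out = pipe(prompt, max_new_tokens=40, do_sample=False, return_full_text=False)
    answer = out[0]["generated_text"].strip()
    # Best-effort filter: keep only single-sentence answers (illustrative, not the exact filter used).
    return answer if answer.count(".") <= 1 else ""

print(simplify("A man riding a bicycle down a busy street."))
```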
### Curation Rationale
This dataset is useful for experiments with small LLMs that were trained on only a reduced corpus. It is suitable for LAVIS experiments (Q-Former training) with a fine-tuned TinyStories 33M LLM.
| Jotschi/visual_genome-simple-en | [
"task_categories:text-generation",
"task_categories:image-to-text",
"task_categories:text-to-image",
"annotations_creators:machine-generated",
"size_categories:n<820k",
"source_datasets:visual_genome",
"language:en",
"visual_genome",
"simple-english",
"region:us"
] | 2024-01-14T13:12:08+00:00 | {"annotations_creators": ["machine-generated"], "language": ["en"], "size_categories": ["n<820k"], "source_datasets": ["visual_genome"], "task_categories": ["text-generation", "image-to-text", "text-to-image"], "pretty_name": "Visual Genome in Simple English", "license_name": "cc-by-4.0", "license_link": "https://creativecommons.org/licenses/by/4.0/legalcode", "tags": ["visual_genome", "simple-english"]} | 2024-01-14T13:16:08+00:00 |
29da96129f17814ec3b7655cd63aadedad1bfafe | tomseiberth/voztom01 | [
"license:openrail",
"region:us"
] | 2024-01-14T13:13:49+00:00 | {"license": "openrail"} | 2024-01-14T13:16:15+00:00 |
|
aa630b49bf0e3481fe753540460a9d6534a160d9 | ping98k/dolly-th | [
"region:us"
] | 2024-01-14T13:17:54+00:00 | {} | 2024-01-15T14:54:06+00:00 |
|
67d20c658c5e393cb6790ee8c3d8131ff91f746a | Humaiz111/AOT_done | [
"license:mit",
"region:us"
] | 2024-01-14T13:20:52+00:00 | {"license": "mit"} | 2024-01-14T13:20:52+00:00 |
|
2fe10001aa6298f63f526566ff55b481ab71740c | HiDBH/AIproject | [
"region:us"
] | 2024-01-14T13:25:00+00:00 | {} | 2024-01-14T13:25:00+00:00 |
|
1cfc3af9dfd102776942d2a5224ff134db0af11f |
# Dataset Card for Evaluation run of Weyaxi/Helion-4x34B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Helion-4x34B](https://huggingface.co/Weyaxi/Helion-4x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Helion-4x34B",
"harness_winogrande_5",
split="train")
```
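The aggregated metrics can be loaded the same way; a minimal follow-up sketch, assuming the "results" config and "latest" split layout used by these evaluation dumps:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_Weyaxi__Helion-4x34B",
	"results",
	split="latest")
print(results[0])
```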
## Latest results
These are the [latest results from run 2024-01-14T13:23:45.843719](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Helion-4x34B/blob/main/results_2024-01-14T13-23-45.843719.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7699592649917206,
"acc_stderr": 0.027825032662237632,
"acc_norm": 0.7733690955948024,
"acc_norm_stderr": 0.028359676428301124,
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6391431988345577,
"mc2_stderr": 0.014739254450901405
},
"harness|arc:challenge|25": {
"acc": 0.6646757679180887,
"acc_stderr": 0.013796182947785562,
"acc_norm": 0.697098976109215,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.6577375024895439,
"acc_stderr": 0.004734972668299616,
"acc_norm": 0.8528181637124079,
"acc_norm_stderr": 0.0035356302890914575
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9078947368421053,
"acc_stderr": 0.02353268597044349,
"acc_norm": 0.9078947368421053,
"acc_norm_stderr": 0.02353268597044349
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8150943396226416,
"acc_stderr": 0.023893351834464324,
"acc_norm": 0.8150943396226416,
"acc_norm_stderr": 0.023893351834464324
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.02477451625044016,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.02477451625044016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7914893617021277,
"acc_stderr": 0.02655698211783874,
"acc_norm": 0.7914893617021277,
"acc_norm_stderr": 0.02655698211783874
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6904761904761905,
"acc_stderr": 0.023809523809523867,
"acc_norm": 0.6904761904761905,
"acc_norm_stderr": 0.023809523809523867
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9,
"acc_stderr": 0.017066403719657255,
"acc_norm": 0.9,
"acc_norm_stderr": 0.017066403719657255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.02602465765165619,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.02602465765165619
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019949,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019949
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.823076923076923,
"acc_stderr": 0.019348070174396985,
"acc_norm": 0.823076923076923,
"acc_norm_stderr": 0.019348070174396985
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03029677128606732,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03029677128606732
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8739495798319328,
"acc_stderr": 0.02155962312121393,
"acc_norm": 0.8739495798319328,
"acc_norm_stderr": 0.02155962312121393
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9137614678899083,
"acc_stderr": 0.012035597300116248,
"acc_norm": 0.9137614678899083,
"acc_norm_stderr": 0.012035597300116248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.032365852526021574,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.032365852526021574
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.018889750550956715,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.018889750550956715
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8773006134969326,
"acc_stderr": 0.025777328426978927,
"acc_norm": 0.8773006134969326,
"acc_norm_stderr": 0.025777328426978927
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.625,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.9320388349514563,
"acc_stderr": 0.024919959142514478,
"acc_norm": 0.9320388349514563,
"acc_norm_stderr": 0.024919959142514478
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311357,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311357
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9054916985951469,
"acc_stderr": 0.01046101533819307,
"acc_norm": 0.9054916985951469,
"acc_norm_stderr": 0.01046101533819307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8410404624277457,
"acc_stderr": 0.01968530703357195,
"acc_norm": 0.8410404624277457,
"acc_norm_stderr": 0.01968530703357195
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7720670391061453,
"acc_stderr": 0.014030149950805097,
"acc_norm": 0.7720670391061453,
"acc_norm_stderr": 0.014030149950805097
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8594771241830066,
"acc_stderr": 0.019899435463539932,
"acc_norm": 0.8594771241830066,
"acc_norm_stderr": 0.019899435463539932
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8456591639871383,
"acc_stderr": 0.020519050342084722,
"acc_norm": 0.8456591639871383,
"acc_norm_stderr": 0.020519050342084722
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.01810541409432967,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.01810541409432967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.028663820147199485,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.028663820147199485
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.621903520208605,
"acc_stderr": 0.012384878406798095,
"acc_norm": 0.621903520208605,
"acc_norm_stderr": 0.012384878406798095
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8455882352941176,
"acc_stderr": 0.021950024722922026,
"acc_norm": 0.8455882352941176,
"acc_norm_stderr": 0.021950024722922026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273344,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8448979591836735,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.8448979591836735,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276908,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276908
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6391431988345577,
"mc2_stderr": 0.014739254450901405
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873504
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.012333447581047546
}
}
```
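To work with these numbers programmatically, a minimal sketch that downloads the results file linked above and averages the MMLU (hendrycksTest) accuracies, assuming the standard results-file layout with a top-level "results" key:

```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Weyaxi__Helion-4x34B",
    filename="results_2024-01-14T13-23-45.843719.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Average accuracy over the 57 MMLU subtasks (assumes a top-level "results" key).
mmlu = [v["acc"] for k, v in data["results"].items() if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu)} MMLU tasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```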
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Helion-4x34B | [
"region:us"
] | 2024-01-14T13:25:54+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Helion-4x34B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Helion-4x34B](https://huggingface.co/Weyaxi/Helion-4x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Helion-4x34B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T13:23:45.843719](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Helion-4x34B/blob/main/results_2024-01-14T13-23-45.843719.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7699592649917206,\n \"acc_stderr\": 0.027825032662237632,\n \"acc_norm\": 0.7733690955948024,\n \"acc_norm_stderr\": 0.028359676428301124,\n \"mc1\": 0.4724602203182375,\n \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6391431988345577,\n \"mc2_stderr\": 0.014739254450901405\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.013796182947785562,\n \"acc_norm\": 0.697098976109215,\n \"acc_norm_stderr\": 0.013428241573185349\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6577375024895439,\n \"acc_stderr\": 0.004734972668299616,\n \"acc_norm\": 0.8528181637124079,\n \"acc_norm_stderr\": 0.0035356302890914575\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9078947368421053,\n \"acc_stderr\": 0.02353268597044349,\n \"acc_norm\": 0.9078947368421053,\n \"acc_norm_stderr\": 0.02353268597044349\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8150943396226416,\n \"acc_stderr\": 0.023893351834464324,\n \"acc_norm\": 0.8150943396226416,\n \"acc_norm_stderr\": 0.023893351834464324\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n \"acc_stderr\": 0.02477451625044016,\n \"acc_norm\": 0.9027777777777778,\n \"acc_norm_stderr\": 0.02477451625044016\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n 
\"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7914893617021277,\n \"acc_stderr\": 0.02655698211783874,\n \"acc_norm\": 0.7914893617021277,\n \"acc_norm_stderr\": 0.02655698211783874\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.034165204477475494,\n \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.034165204477475494\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6904761904761905,\n \"acc_stderr\": 0.023809523809523867,\n \"acc_norm\": 0.6904761904761905,\n \"acc_norm_stderr\": 0.023809523809523867\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.017066403719657255,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.017066403719657255\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.02602465765165619,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.02602465765165619\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019949,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019949\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.823076923076923,\n \"acc_stderr\": 0.019348070174396985,\n 
\"acc_norm\": 0.823076923076923,\n \"acc_norm_stderr\": 0.019348070174396985\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03029677128606732,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03029677128606732\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8739495798319328,\n \"acc_stderr\": 0.02155962312121393,\n \"acc_norm\": 0.8739495798319328,\n \"acc_norm_stderr\": 0.02155962312121393\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116248,\n \"acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116248\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.032365852526021574,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.032365852526021574\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.018889750550956715,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.018889750550956715\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9320388349514563,\n \"acc_stderr\": 0.024919959142514478,\n \"acc_norm\": 0.9320388349514563,\n \"acc_norm_stderr\": 0.024919959142514478\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n \"acc_stderr\": 0.016534627684311357,\n \"acc_norm\": 0.9316239316239316,\n \"acc_norm_stderr\": 0.016534627684311357\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9054916985951469,\n \"acc_stderr\": 0.01046101533819307,\n \"acc_norm\": 0.9054916985951469,\n \"acc_norm_stderr\": 0.01046101533819307\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8410404624277457,\n \"acc_stderr\": 0.01968530703357195,\n \"acc_norm\": 0.8410404624277457,\n \"acc_norm_stderr\": 0.01968530703357195\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7720670391061453,\n \"acc_stderr\": 0.014030149950805097,\n \"acc_norm\": 0.7720670391061453,\n \"acc_norm_stderr\": 0.014030149950805097\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8594771241830066,\n \"acc_stderr\": 0.019899435463539932,\n \"acc_norm\": 0.8594771241830066,\n \"acc_norm_stderr\": 0.019899435463539932\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8456591639871383,\n \"acc_stderr\": 0.020519050342084722,\n \"acc_norm\": 0.8456591639871383,\n \"acc_norm_stderr\": 0.020519050342084722\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.01810541409432967,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.01810541409432967\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.028663820147199485,\n \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.028663820147199485\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.621903520208605,\n \"acc_stderr\": 0.012384878406798095,\n \"acc_norm\": 0.621903520208605,\n \"acc_norm_stderr\": 0.012384878406798095\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8455882352941176,\n \"acc_stderr\": 0.021950024722922026,\n \"acc_norm\": 0.8455882352941176,\n \"acc_norm_stderr\": 0.021950024722922026\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273344,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273344\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8448979591836735,\n \"acc_stderr\": 0.0231747988612186,\n \"acc_norm\": 0.8448979591836735,\n \"acc_norm_stderr\": 0.0231747988612186\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4724602203182375,\n \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6391431988345577,\n \"mc2_stderr\": 0.014739254450901405\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873504\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \"acc_stderr\": 0.012333447581047546\n }\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/Helion-4x34B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|arc:challenge|25_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|gsm8k|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hellaswag|10_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-23-45.843719.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-23-45.843719.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-23-45.843719.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T13-23-45.843719.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-23-45.843719.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-23-45.843719.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["**/details_harness|winogrande|5_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T13-23-45.843719.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T13_23_45.843719", "path": ["results_2024-01-14T13-23-45.843719.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T13-23-45.843719.parquet"]}]}]} | 2024-01-14T13:26:14+00:00 |
5cadd32c4483cb50dceb9aee929d20d86b1355b7 | HELLOLIULIU/Demo | ["region:us"] | 2024-01-14T13:27:52+00:00 | {} | 2024-01-22T12:52:01+00:00 |