---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: gold_label
dtype: string
- name: config
dtype: string
splits:
- name: train
num_bytes: 10931716
num_examples: 48000
- name: test
num_bytes: 1834340
num_examples: 8000
- name: dev
num_bytes: 1817170
num_examples: 8000
download_size: 3883767
dataset_size: 14583226
---
# Dataset Card for "semantic_fragments_nli"
NLI sentence pairs (`sentence1`, `sentence2`, `gold_label`) generated from semantic fragments, with the originating fragment recorded in the `config` column; the card metadata lists 48,000 train, 8,000 dev, and 8,000 test examples.

- Repository: https://github.com/yakazimir/semantic_fragments
- Paper: https://arxiv.org/pdf/1909.07521.pdf
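
A minimal loading sketch using the Hugging Face `datasets` library. The repository id passed to `load_dataset` is assumed from the card title and may need the hosting namespace prepended; the split names and columns follow the metadata above.

```python
from datasets import load_dataset

# Repo id assumed from the card title; adjust to the actual hub namespace if needed.
ds = load_dataset("semantic_fragments_nli")

print(ds)                    # splits: train (48,000), test (8,000), dev (8,000)
print(ds["train"].features)  # sentence1, sentence2, gold_label, config

# Each example carries the fragment it was generated from in the `config` column,
# so a single fragment can be selected by filtering on that value.
first_config = ds["train"][0]["config"]
subset = ds["train"].filter(lambda ex: ex["config"] == first_config)
print(len(subset), subset[0])
```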
```bib
@article{Richardson2019ProbingNL,
title={Probing Natural Language Inference Models through Semantic Fragments},
author={Kyle Richardson and Hai Hu and Lawrence S. Moss and Ashish Sabharwal},
journal={ArXiv},
year={2019},
volume={abs/1909.07521},
url={https://api.semanticscholar.org/CorpusID:202583828}
}
```