Commit edcafa3 · Parent(s): 310a150
Daniel O'Connell committed: remove arxiv_papers

Files changed:
- README.md (+7, -1)
- alignment-research-dataset.py (+3, -1)
- arxiv_papers.jsonl (+0, -3)
README.md
CHANGED

```diff
@@ -1039,8 +1039,14 @@ LessWrong posts have overweighted content on doom and existential risk, so pleas
 
 The scraper to generate this dataset is open-sourced on [GitHub](https://github.com/StampyAI/alignment-research-dataset) and currently maintained by volunteers at StampyAI / AI Safety Info. [Learn more](https://coda.io/d/AI-Safety-Info_dfau7sl2hmG/Get-involved_susRF#_lufSr) or join us on [Discord](https://discord.gg/vjFSCDyMCy).
 
+## Rebuilding info
+
+This README contains info about the number of rows and their features which should be rebuilt each time datasets get changed. To do so, run:
+
+datasets-cli test ./alignment-research-dataset --save_info --all_configs
+
 ## Citing the Dataset
 
 For more information, here is the [paper](https://arxiv.org/abs/2206.02841) and [LessWrong](https://www.lesswrong.com/posts/FgjcHiWvADgsocE34/a-descriptive-not-prescriptive-overview-of-current-ai) post. Please use the following citation when using the dataset:
 
-Kirchner, J. H., Smith, L., Thibodeau, J., McDonnell, K., and Reynolds, L. "Understanding AI alignment research: A Systematic Analysis." arXiv preprint arXiv:2022.4338861 (2022).
+Kirchner, J. H., Smith, L., Thibodeau, J., McDonnell, K., and Reynolds, L. "Understanding AI alignment research: A Systematic Analysis." arXiv preprint arXiv:2022.4338861 (2022).
```
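The new "Rebuilding info" section regenerates the row counts and feature schema recorded in this README. As a rough illustration of what that metadata is, here is a minimal Python sketch (not part of the commit): it loads one config of the local dataset script with the Hugging Face `datasets` library and prints the pieces that `datasets-cli test ... --save_info` writes out. The config name `"arxiv"` is an assumption for illustration only; the real config names are the keys of `DATASOURCES` in the loading script.

```python
# Minimal sketch (not part of the commit): inspect the metadata that the
# "Rebuilding info" command regenerates, using the Hugging Face `datasets` API.
from datasets import load_dataset_builder

# NOTE: the config name "arxiv" is assumed for illustration; the actual
# config names are the keys of DATASOURCES in alignment-research-dataset.py.
builder = load_dataset_builder("./alignment-research-dataset", "arxiv")

print(builder.info.features)  # the feature schema recorded in the README
for name, split in (builder.info.splits or {}).items():
    print(name, split.num_examples)  # the row counts recorded in the README
```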
alignment-research-dataset.py
CHANGED

```diff
@@ -125,6 +125,7 @@ DATASOURCES = {
         'tags': Sequence(feature=Value(dtype='string')),
         'modified_at': Value(dtype='string'),
         'source_type': Value(dtype='string'),
+        'summary': Sequence(feature=Value(dtype='string')),
     },
     'alignment_newsletter': {
         'converted_with': Value(dtype='string'),
@@ -174,7 +175,8 @@ DATASOURCES = {
         'abstract': Value(dtype='string'),
         'journal_ref': Value(dtype='string'),
         'doi': Value(dtype='string'),
-        'bibliography_bib': Sequence(feature={'title': Value(dtype='string')}, length=-1)
+        'bibliography_bib': Sequence(feature={'title': Value(dtype='string')}, length=-1),
+        'summary': Sequence(feature=Value(dtype='string')),
     },
     'eaforum': {
         'karma': Value(dtype='int32'),
```
arxiv_papers.jsonl
DELETED

```diff
@@ -1,3 +0,0 @@
-version https://git-lfs.github.com/spec/v1
-oid sha256:e229f3dde315efe606975989edae496cbf1b7da2d27cc8ea6d86c5a00e9141b0
-size 155468988
```