---
datasets:
- abertsch/booksum-fullbooks
pipeline_tag: text2text-generation
---

Model from the preprint [Unlimiformer: Long-Range Transformers with Unlimited Length Input](https://arxiv.org/abs/2305.01625).

This model was finetuned from a BART-base model using the retrieval-augmented training strategy described in Section 3.2 of the paper. It was finetuned on the BookSum dataset in the full-book setting.
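
Below is a minimal usage sketch with the Hugging Face Transformers library. The model id is a placeholder for this repository's id, and plain `generate` does not apply the Unlimiformer retrieval mechanism (that requires the code released with the paper), so inputs in this sketch are limited to BART's standard 1024-token context.

```python
# Minimal sketch, assuming a placeholder model id and standard Transformers
# loading. The Unlimiformer long-range retrieval mechanism is NOT applied here;
# it requires the code released with the paper.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "path/to/this-checkpoint"  # hypothetical; replace with this repo's id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Without the Unlimiformer wrapper, inputs are truncated to BART's
# standard maximum length of 1024 tokens.
text = "Full book text goes here..."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, max_new_tokens=256, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```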