Baseline model for the preprint Unlimiformer: Long-Range Transformers with Unlimited Length Input.
This model was finetuned from a BART-base checkpoint as a baseline. It was finetuned on the SummScreen dataset using the data preprocessing pipeline from SLED; to load the validation or test set for use with this model, please use the datasets urialon/summ_screen_validation and urialon/summ_screen_test.
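Below is a minimal sketch of loading the datasets named above and running this baseline with the Hugging Face `transformers` and `datasets` libraries. The model identifier, the split name, and the `input` field name are assumptions (placeholders), not guarantees; replace the identifier with this repository's Hub id and inspect the loaded dataset to confirm its split and column names.

```python
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the SLED-preprocessed SummScreen splits referenced above.
validation = load_dataset("urialon/summ_screen_validation")
test = load_dataset("urialon/summ_screen_test")

# Placeholder id: substitute this repository's Hub identifier.
model_name = "path/to/this-model"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Assumed split and field names; check validation.column_names to confirm.
example = validation["validation"][0]
inputs = tokenizer(
    example["input"],
    return_tensors="pt",
    truncation=True,
    max_length=1024,  # BART-base baseline context limit
)
summary_ids = model.generate(**inputs, num_beams=4, max_new_tokens=128)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```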