
# longformer-base-4096-spanish

## Longformer

longformer-base-4096-spanish is a BERT-like model that starts from a RoBERTa checkpoint (BERTIN in this case) and is pre-trained for masked language modeling (MLM) on long documents (from BETO's all_wikis corpus). It supports sequences of up to 4,096 tokens!
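A minimal usage sketch with the `transformers` fill-mask pipeline; the `<mask>` token follows the RoBERTa convention inherited from BERTIN, and the example sentence is illustrative only:

```python
from transformers import pipeline

# Load the published checkpoint into a fill-mask pipeline.
model_id = "mrm8488/longformer-base-4096-spanish"
fill_mask = pipeline("fill-mask", model=model_id)

# "<mask>" is the RoBERTa-style mask token used by this checkpoint.
for pred in fill_mask("Madrid es la capital de <mask>."):
    print(pred["token_str"], pred["score"])
```

For long inputs, the same model can be loaded with `AutoModelForMaskedLM` and fed sequences up to 4,096 tokens.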

Longformer uses a combination of sliding-window (local) attention and global attention. Global attention is user-configured based on the task, allowing the model to learn task-specific representations.
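The combined attention pattern can be illustrated with a small boolean mask. This is a minimal sketch, not the library's implementation: the helper name, the symmetric window, and the toy sizes are all assumptions for illustration:

```python
def longformer_attention_mask(seq_len, window, global_idx):
    """Hypothetical sketch of the Longformer attention pattern.

    Returns a seq_len x seq_len boolean matrix where entry [i][j] is True
    if query position i may attend to key position j:
      - local:  |i - j| <= window // 2 (symmetric sliding window)
      - global: positions in global_idx attend everywhere and are
                attended to by every position.
    """
    half = window // 2
    g = set(global_idx)
    return [
        [abs(i - j) <= half or i in g or j in g for j in range(seq_len)]
        for i in range(seq_len)
    ]

# Toy example: 8 tokens, window of 2, global attention on position 0
# (e.g. a [CLS]-like token for classification).
mask = longformer_attention_mask(8, window=2, global_idx=[0])
```

In practice the window is much larger (hundreds of tokens), and which positions get global attention depends on the task, e.g. the question tokens in QA.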

This model was built following the research of Iz Beltagy, Matthew E. Peters, and Arman Cohan.

## Citation

If you want to cite this model, you can use:

```bibtex
@misc{mromero2022longformer-base-4096-spanish,
  title={Spanish LongFormer by Manuel Romero},
  author={Romero, Manuel},
  publisher={Hugging Face},
  journal={Hugging Face Hub},
  howpublished={\url{https://huggingface.co/mrm8488/longformer-base-4096-spanish}},
  year={2022}
}
```