---
language:
  - es
license: mit
widget:
  - text: >-
      Manuel Romero ha creado con el equipo de BERTIN un modelo que procesa
      documentos <mask> largos.
tags:
  - Long documents
  - longformer
  - bertin
  - spanish
datasets:
  - spanish_large_corpus
---

# longformer-base-4096-spanish

Longformer is a Transformer model for long documents.

**longformer-base-4096-spanish** is a BERT-like model initialized from a RoBERTa checkpoint (BERTIN in this case) and pre-trained for masked language modeling (MLM) on long documents from BETO's `all_wikis` corpus. It supports sequences of up to 4,096 tokens!
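As a quick check, the model can be used with the standard `transformers` fill-mask pipeline. This is a minimal sketch, assuming `transformers` and PyTorch are installed; the example sentence is the widget text from this card.

```python
from transformers import pipeline

# Load this checkpoint in a fill-mask pipeline.
fill_mask = pipeline(
    "fill-mask",
    model="mrm8488/longformer-base-4096-spanish",
)

# Predict the masked token in the card's widget example.
predictions = fill_mask(
    "Manuel Romero ha creado con el equipo de BERTIN un modelo "
    "que procesa documentos <mask> largos."
)
for p in predictions:
    print(f"{p['token_str']!r}: {p['score']:.4f}")
```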

Longformer uses a combination of sliding-window (local) attention and global attention. Global attention is user-configured based on the task, allowing the model to learn task-specific representations.
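The sketch below shows how global attention is typically configured with the `transformers` Longformer API (an assumption based on the standard Longformer usage pattern, not a recipe specific to this checkpoint): every token gets sliding-window attention by default, and a `global_attention_mask` marks the tokens that should also attend globally, here just the leading `<s>` token.

```python
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "mrm8488/longformer-base-4096-spanish"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Placeholder input; in practice this would be a long Spanish document.
inputs = tokenizer("Un documento largo de ejemplo.", return_tensors="pt")

# 0 = local (sliding-window) attention, 1 = global attention.
# Here only the first token (<s>) attends globally.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

outputs = model(**inputs, global_attention_mask=global_attention_mask)
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```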

This model follows the research of Iz Beltagy, Matthew E. Peters, and Arman Cohan, the authors of the original Longformer paper.

## Citation

If you want to cite this model, you can use:

```bibtex
@misc{mromero2022longformer-base-4096-spanish,
  title={Spanish LongFormer by Manuel Romero},
  author={Romero, Manuel},
  publisher={Hugging Face},
  journal={Hugging Face Hub},
  howpublished={\url{https://huggingface.co/mrm8488/longformer-base-4096-spanish}},
  year={2022}
}
```