Narrativa mrm8488 committed on
Commit
0c4fb7a
1 Parent(s): 90cd510

Update README.md (#1)


- Update README.md (5ffd9b4167f4b2814ca3edf689413da1aa1ba89b)


Co-authored-by: Manuel Romero <[email protected]>

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -23,7 +23,7 @@ tags:
 
 # Legal ⚖️ longformer-base-4096-spanish
 
-`legal-longformer-base-4096` is a BERT-like model started from the RoBERTa checkpoint (**[RoBERTalex](PlanTL-GOB-ES/RoBERTalex)** in this case) and pre-trained for *MLM* on long documents from the [Spanish Legal Domain Corpora](https://zenodo.org/record/5495529/#.Y205lpHMKV5). It supports sequences of length up to **4,096**!
+`legal-longformer-base-4096` is a BERT-like model started from the RoBERTa checkpoint (**[RoBERTalex](https://huggingface.co/PlanTL-GOB-ES/RoBERTalex)** in this case) and pre-trained for *MLM* on long documents from the [Spanish Legal Domain Corpora](https://zenodo.org/record/5495529/#.Y205lpHMKV5). It supports sequences of length up to **4,096**!
 
 **Longformer** uses a combination of a sliding window (*local*) attention and *global* attention. Global attention is user-configured based on the task to allow the model to learn task-specific representations.
 
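As context for the README text above: in the Hugging Face `transformers` Longformer API, the per-token global attention is specified with a `global_attention_mask` (1 = global attention, 0 = local sliding-window attention). The sketch below builds such a mask in plain Python; the model id in the comments is hypothetical, since the commit does not state the final repository name.

```python
# Sketch: building a Longformer global_attention_mask by hand.
# Convention (Hugging Face transformers): 1 = global attention, 0 = local
# sliding-window attention. Which positions get global attention is a
# task-specific choice made by the user.

def build_global_attention_mask(seq_len, global_positions):
    """Return a 0/1 list marking which token positions attend globally."""
    mask = [0] * seq_len
    for pos in global_positions:
        mask[pos] = 1
    return mask

# Typical choice for sequence classification: global attention on the
# first (<s>/[CLS]) token only.
mask = build_global_attention_mask(8, [0])
print(mask)  # [1, 0, 0, 0, 0, 0, 0, 0]

# With transformers it would be passed alongside the inputs, roughly:
# (model id below is hypothetical, not confirmed by this commit)
#   from transformers import AutoTokenizer, AutoModelForMaskedLM
#   import torch
#   tok = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/RoBERTalex")
#   model = AutoModelForMaskedLM.from_pretrained("legal-longformer-base-4096")
#   enc = tok("Texto legal ...", return_tensors="pt")
#   out = model(**enc, global_attention_mask=torch.tensor([mask]))
```

In the original Longformer paper this split keeps attention cost linear in sequence length (local windows) while still letting a few chosen tokens see the whole 4,096-token document.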