bert-base-spanish-wwm-cased-caresA
This model is a fine-tuned version of bert-base-spanish-wwm-cased for the cantemist dataset, used in a benchmark in the paper *A comparative analysis of Spanish Clinical encoder-based models on NER and classification tasks*. The model achieves an F1 score of 0.992.
Please refer to the original publication for more information.
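As a quick orientation, the snippet below shows one way to load the checkpoint with the transformers library. It is a minimal sketch under assumptions: the repository id is a placeholder (substitute the actual Hugging Face Hub id under which this model is published), and it assumes a sequence-classification head; for NER-style tasks, AutoModelForTokenClassification would be the analogous choice.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder repo id; replace with the actual Hub repository for this model.
model_id = "bert-base-spanish-wwm-cased-caresA"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Illustrative Spanish clinical sentence.
text = "Paciente con dolor torácico y disnea de esfuerzo."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its label name.
predicted = model.config.id2label[logits.argmax(dim=-1).item()]
print(predicted)
```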
Parameters used
| Parameter | Value |
|---|---|
| batch size | 32 |
| learning rate | 4e-05 |
| classifier dropout | 0.2 |
| warmup ratio | 0 |
| warmup steps | 0 |
| weight decay | 0 |
| optimizer | AdamW |
| epochs | 10 |
| early stopping patience | 3 |
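For illustration, the sketch below maps the table above onto a transformers training setup. It is not the authors' actual training script: the base checkpoint id (dccuchile/bert-base-spanish-wwm-cased), the number of labels, and the dataset handling are assumptions, while the hyperparameter values come directly from the table.

```python
from transformers import (
    AutoConfig,
    AutoModelForSequenceClassification,
    AutoTokenizer,
    EarlyStoppingCallback,
    TrainingArguments,
)

# Assumed Hub id for the bert-base-spanish-wwm-cased (BETO, cased) base model.
base_model = "dccuchile/bert-base-spanish-wwm-cased"

# Classifier dropout (0.2) is set on the model config; num_labels is a placeholder.
config = AutoConfig.from_pretrained(base_model, classifier_dropout=0.2, num_labels=2)
tokenizer = AutoTokenizer.from_pretrained(base_model)  # used to tokenize the dataset splits
model = AutoModelForSequenceClassification.from_pretrained(base_model, config=config)

training_args = TrainingArguments(
    output_dir="bert-base-spanish-wwm-cased-caresA",
    per_device_train_batch_size=32,  # batch size
    learning_rate=4e-5,              # learning rate
    warmup_ratio=0.0,                # warmup ratio
    warmup_steps=0,                  # warmup steps
    weight_decay=0.0,                # weight decay
    num_train_epochs=10,             # epochs
    optim="adamw_torch",             # AdamW optimizer
    eval_strategy="epoch",           # `evaluation_strategy` in older transformers releases
    save_strategy="epoch",
    load_best_model_at_end=True,
)

# Early stopping with patience 3; pass model, training_args, the tokenized
# train/validation splits, and this callback to transformers.Trainer.
early_stopping = EarlyStoppingCallback(early_stopping_patience=3)
```

With per-epoch evaluation and load_best_model_at_end enabled, the EarlyStoppingCallback stops training once the validation metric fails to improve for three consecutive epochs, which matches the early stopping patience listed above.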
BibTeX entry and citation info
```bibtex
@article{10.1093/jamia/ocae054,
    author  = {García Subies, Guillem and Barbero Jiménez, Álvaro and Martínez Fernández, Paloma},
    title   = {A comparative analysis of Spanish Clinical encoder-based models on NER and classification tasks},
    journal = {Journal of the American Medical Informatics Association},
    volume  = {31},
    number  = {9},
    pages   = {2137--2146},
    year    = {2024},
    month   = {03},
    issn    = {1527-974X},
    doi     = {10.1093/jamia/ocae054},
    url     = {https://doi.org/10.1093/jamia/ocae054},
}
```