llange committed
Commit e0ed708 · 1 Parent(s): d61bc7e

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -1,6 +1,6 @@
 # CLIN-X-ES: a pre-trained language model for the Spanish clinical domain
 Details on the model, the pre-training corpus and the downstream task performance are given in the paper: "CLIN-X: pre-trained language models and a study on cross-task transfer for concept extraction in the clinical domain" by Lukas Lange, Heike Adel, Jannik Strötgen and Dietrich Klakow.
-The paper can be found [here](https://github.com/boschresearch/clin_x).
+The paper can be found [here](https://arxiv.org/abs/2112.08754).
 In case of questions, please contact the authors as listed on the paper.
 
 Please cite the above paper when reporting, reproducing or extending the results.
@@ -12,10 +12,10 @@ Please cite the above paper when reporting, reproducing or extending the results
             Dietrich Klakow},
   title = {CLIN-X: pre-trained language models and a study on cross-task transfer for concept extraction in the clinical domain},
   year={2021},
-  eprint={},
+  eprint={2112.08754},
   archivePrefix={arXiv},
   primaryClass={cs.CL},
-  url={}
+  url={https://arxiv.org/abs/2112.08754}
 }
 
 ## Training details
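
For context, a minimal sketch of how a CLIN-X-ES-style checkpoint could be loaded from the Hugging Face Hub with the `transformers` library. The repository ID used below is an assumption for illustration only; the exact model ID should be taken from the model card itself.

```python
# Minimal sketch, assuming the CLIN-X-ES checkpoint is available on the Hugging Face Hub.
from transformers import AutoModel, AutoTokenizer

# Hypothetical repository ID -- replace with the actual ID from the model card.
model_id = "llange/xlm-roberta-large-spanish-clinical"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a Spanish clinical sentence and obtain contextual embeddings.
inputs = tokenizer("El paciente presenta fiebre y tos persistente.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```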