scheiblr and nielsr (HF staff) committed
Commit ef1c549 · verified · 1 Parent(s): 9bdae32

Add link to paper (#3)


- Add link to paper (fc58fecf19e9ff8279fbd446c60c0284f55f0541)


Co-authored-by: Niels Rogge <[email protected]>

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -17,7 +17,7 @@ GottBERT is the first German-only RoBERTa model, pre-trained on the German porti
 - **Large Model**: 24 layers, 355 million parameters
 - **License**: MIT
 
----
+This was presented in [GottBERT: a pure German Language Model](https://huggingface.co/papers/2012.02110).
 
 ## Pretraining Details