# BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA

# Abstract

The impact of design choices on the performance […] the significant effect of design choices on improving the performance of biomedical language models.

# Model Description

This model was pre-trained on PubMed Abstracts only, with a biomedical-domain vocabulary, for 500K steps with a batch size of 1024 on a TPUv3-32 unit.

Check our GitHub repo at https://github.com/salrowili/BioM-Transformers for TensorFlow and GluonNLP checkpoints.
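As a usage sketch (not part of the original card), the checkpoint can be loaded with the Hugging Face `transformers` library. The model ID below is an assumption for illustration; substitute the actual ID of this model card.

```python
# Hypothetical usage sketch: loading a BioM-Transformers checkpoint with the
# Hugging Face `transformers` library. The model ID below is an assumption;
# replace it with this model card's actual ID.
from transformers import AutoModel, AutoTokenizer

model_id = "sultan/BioM-ELECTRA-Base-Discriminator"  # assumed ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a biomedical sentence and extract contextual embeddings.
inputs = tokenizer("Aspirin inhibits platelet aggregation.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```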

# Acknowledgment

We would like to acknowledge the support of the TensorFlow Research Cloud (TFRC) team, who granted us access to TPUv3 units.

# Citation

```bibtex
@inproceedings{alrowili-shanker-2021-biom,