sultan committed on
Commit
bb474f2
1 Parent(s): 0b5f08c

Update README.md

Files changed (1)
  1. README.md +6 -5
README.md CHANGED
@@ -1,8 +1,6 @@
-BioM-Transformers: Building Large Biomedical Language Models with
-BERT, ALBERT and ELECTRA
-
-Abstract
+# BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA
 
+# Abstract
 
 The impact of design choices on the performance
 of biomedical language models recently
@@ -21,15 +19,18 @@ the significant effect of design choices on
 improving the performance of biomedical language
 models.
 
+# Model Description
+
 This model was pre-trained on PubMed Abstracts only with biomedical domain vocabulary for 434K steps with a batch size of 4096 on TPUv3-512 unit.
 
 Check our GitHub repo at https://github.com/salrowili/BioM-Transformers for TensorFlow and GluonNLP checkpoints.
 
-Acknowledgment
+# Acknowledgment
 
 We would like to acknowledge the support we have from Tensorflow Research Cloud (TFRC) team to grant us access to TPUv3 units.
 
 
+# Citation
 
 ```bibtex
 @inproceedings{alrowili-shanker-2021-biom,