---
language: es
tags:
- Spanish
- Electra
- Bio
- Medical
datasets:
- cowese
---

## BIOMEDTRA 🦠🏥

**BIOMEDTRA** (small) is an ELECTRA-like model (a discriminator, in this case) trained on the [Spanish Biomedical Crawled Corpus](https://zenodo.org/record/5510033#.Yhdk1ZHMLJx).

As described in the original [paper](https://openreview.net/pdf?id=r1xMH1BtvB):
**ELECTRA** is a new method for self-supervised language representation learning. It can be used to pre-train transformer networks using relatively little compute. ELECTRA models are trained to distinguish "real" input tokens from "fake" input tokens generated by another neural network, similar to the discriminator of a [GAN](https://arxiv.org/pdf/1406.2661.pdf). At small scale, ELECTRA achieves strong results even when trained on a single GPU. At large scale, ELECTRA achieves state-of-the-art results on the [SQuAD 2.0](https://rajpurkar.github.io/SQuAD-explorer/) dataset.

For a detailed description and experimental results, please refer to the paper [ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators](https://openreview.net/pdf?id=r1xMH1BtvB).

## Training details

The model was trained with the ELECTRA base code for 3 days on 1 Tesla V100 (16 GB).

## Model details ⚙

|Param|Value|
|-----|-----|
|Layers|12|
|Hidden size|256|
|Params|14M|

## Evaluation metrics (for the discriminator) 🧾

|Metric|Score|
|------|-----|
|Accuracy|0.955|
|Precision|0.790|
|AUC|0.971|

## Benchmarks 🔨

WIP 🚧

## How to use the discriminator in `transformers`

TBA
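Until the official snippet lands here, the sketch below shows the standard way an ELECTRA discriminator is queried with `transformers`: it scores each token as "real" or "fake" (replaced). The checkpoint id `mrm8488/biomedtra-small-es` and the example sentence are assumptions for illustration, not confirmed by this card.

```python
# Sketch: standard ELECTRA discriminator usage (replaced-token detection).
import torch
from transformers import ElectraForPreTraining, ElectraTokenizerFast

model_id = "mrm8488/biomedtra-small-es"  # assumed checkpoint id
discriminator = ElectraForPreTraining.from_pretrained(model_id)
tokenizer = ElectraTokenizerFast.from_pretrained(model_id)

sentence = "El paciente presenta fiebre y tos persistente"
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    # One logit per token; positive means the token looks "fake" (replaced).
    logits = discriminator(**inputs).logits  # shape: (1, seq_len)

predictions = (torch.sigmoid(logits) > 0.5).long()
for token, pred in zip(
    tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]),
    predictions[0].tolist(),
):
    print(f"{token}: {'fake' if pred else 'real'}")
```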

## Acknowledgments

TBA

> Created by [Manuel Romero/@mrm8488](https://twitter.com/mrm8488)

> Made with <span style="color: #e25555;">&hearts;</span> in Spain