Update README.md

README.md CHANGED

```diff
@@ -14,6 +14,8 @@ tags:
 - madgrad
 pipeline_tag: text-classification
 library_name: transformers
+base_model: gbyuvd/chemselfies-base-bertmlm
+base_model_relation: finetune
 ---
 
 # ChemFIE-SA (ChemSELFIES - Synthesis Accessibility)
@@ -165,11 +167,11 @@ The model (currently only trained on the 1st chunk) was evaluated using four tes
 
 A comprehensive set of metrics employed to evaluate the model's performance:
 
-1. **Accuracy (ACC)
-2. **Recall
-3. **Precision
-4. **F1-score
-5. **Area Under the Receiver Operating Characteristic curve (AUROC)
+1. **Accuracy (ACC)**
+2. **Recall**
+3. **Precision**
+4. **F1-score**
+5. **Area Under the Receiver Operating Characteristic curve (AUROC)**
 
 All metrics were evaluated using a threshold of 0.50 for binary classification.
 
@@ -190,7 +192,7 @@ Comparison data is sourced from Wang et al. (2023), used various models as encod
 
 which was trained/fine-tuned to predict based on SMILES - while ChemFIE-SA is SELFIES-based:
 
-| **Model**            | **Recall** | **Precision** |
+| **Model**            | **Recall** | **Precision** | **F1**      | **AUROC** |
 | -------------------- | :--------: | :-----------: | :---------: | :-------: |
 | DeepSA_DeBERTa       |   0.873    |     0.920     |    0.896    |   0.959   |
 | DeepSA_GraphCodeBert |   0.931    |     0.944     |    0.937    |   0.987   |
```
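As a minimal sketch of how the five metrics above can be computed at the 0.50 cutoff the README mentions — this is an illustration with made-up labels and probabilities, not the model's actual evaluation code — scikit-learn covers all of them directly:

```python
import numpy as np
from sklearn.metrics import (accuracy_score, recall_score, precision_score,
                             f1_score, roc_auc_score)

# Hypothetical binary labels and predicted probabilities (illustrative only)
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_prob = np.array([0.91, 0.12, 0.67, 0.45, 0.38, 0.05, 0.88, 0.60])

# Binarize at the 0.50 threshold used for ACC / Recall / Precision / F1;
# AUROC is threshold-free and uses the raw probabilities.
threshold = 0.50
y_pred = (y_prob >= threshold).astype(int)

print("ACC      :", accuracy_score(y_true, y_pred))
print("Recall   :", recall_score(y_true, y_pred))
print("Precision:", precision_score(y_true, y_pred))
print("F1       :", f1_score(y_true, y_pred))
print("AUROC    :", roc_auc_score(y_true, y_prob))
```

Note the asymmetry: the first four metrics depend on the chosen threshold, while AUROC summarizes ranking quality across all thresholds.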