sdk: static
pinned: false
---

<p align="center">
  <img src="bmilogo.png" />
</p>

<p align="center">
  <img src="bmi_scritta.png" />
</p>

# BMI - Biomedical Informatics Lab "Mario Stefanelli"

## About Us

Here we host public weights for our biomedical language models. There are several…

| Model | Domain | Type | Details |
|------------|---------|-------------------|-------------------------------------------------------------|
| [Igea](https://huggingface.co/Detsutut/Igea-1B-v0.0.1) | Biomedical | CausalLM Pretrain | Small language model trained after [sapienzanlp/Minerva-1B-base-v1.0](https://huggingface.co/sapienzanlp/Minerva-1B-base-v1.0) with 28GB of PubMed abstracts (as in BioBERT) translated from English into Italian with Neural Machine Translation (GNMT). |
| [BioBIT](https://huggingface.co/bmi-labmedinfo/bioBIT) <sup>*</sup> | Biomedical | MaskedLM Pretrain | BERT model trained after [dbmdz/bert-base-italian-xxl-cased](https://huggingface.co/dbmdz/bert-base-italian-xxl-cased) with 28GB of PubMed abstracts (as in BioBERT) translated from English into Italian with Neural Machine Translation (GNMT). |
| [MedBIT](https://huggingface.co/bmi-labmedinfo/medBIT) <sup>*</sup> | Medical | MaskedLM Pretrain | BERT model trained after [BioBIT](https://huggingface.co/bmi-labmedinfo/bioBIT) with an additional 100MB of medical-textbook data, without any regularization. |
| [MedBIT-R3+](https://huggingface.co/bmi-labmedinfo/medBIT-r3-plus) (recommended) <sup>*</sup> | Medical | MaskedLM Pretrain | BERT model trained after [BioBIT](https://huggingface.co/bmi-labmedinfo/bioBIT) with an additional 200MB of medical-textbook data and web-crawled medical resources in Italian. Regularized with LLRD (.95), Mixout (.9), and warmup (.02). |

<sup>*</sup> <small>Model developed for the [Italian Neuroscience and Rehabilitation Network](https://www.reteneuroscienze.it/en/istituti-nazionali-virtuali/) in partnership with the Neuroinformatics Lab of IRCCS Centro San Giovanni di Dio Fatebenefratelli, Brescia, Italy.</small>
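
The masked-language models listed above can be queried with the standard Hugging Face `transformers` fill-mask pipeline. A minimal sketch, assuming `transformers` is installed and the Hub is reachable; the Italian example sentence and its predictions are illustrative, not from this README:

```python
# Minimal fill-mask sketch for MedBIT-R3+ (a BERT-style MaskedLM model).
# Assumes the `transformers` package is installed and the model can be
# downloaded from the Hugging Face Hub on first use.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bmi-labmedinfo/medBIT-r3-plus")

# [MASK] is the mask token for BERT-style tokenizers like this one.
for prediction in fill_mask("Il paziente presenta una frattura del [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Igea is a causal model, so the analogous entry point would be `pipeline("text-generation", model="Detsutut/Igea-1B-v0.0.1")` rather than `fill-mask`.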

Other models coming soon!