Update README.md
README.md CHANGED
@@ -42,7 +42,7 @@ Here we host public weights for our biomedical language models. There are severa

| Model | Domain | Type | Details |
|------------|---------|-------------------|-------------------------------------------------------------|
- | [Igea](https://huggingface.co/
+ | [Igea](https://huggingface.co/bmi-labmedinfo/Igea-7B-v0.1) | Biomedical | CausalLM Pretrain | Small language model trained after [sapienzanlp/Minerva](https://huggingface.co/sapienzanlp/Minerva-1B-base-v1.0) with more than 5 billion biomedical words in Italian. Four versions available: [350M params](https://huggingface.co/bmi-labmedinfo/Igea-350M-v0.1), [1B params](https://huggingface.co/bmi-labmedinfo/Igea-1B-v0.1), [3B params](https://huggingface.co/bmi-labmedinfo/Igea-3B-v0.1), and [7B params](https://huggingface.co/bmi-labmedinfo/Igea-7B-v0.1). |
| [BioBIT](https://huggingface.co/bmi-labmedinfo/bioBIT) <sup>*</sup>| Biomedical | MaskedLM Pretrain | BERT model trained after [dbmdz/bert-base-italian-xxl-cased](https://huggingface.co/dbmdz/bert-base-italian-xxl-cased) with 28GB of PubMed abstracts (as in BioBERT) that have been translated from English into Italian using Neural Machine Translation (GNMT). |
| [MedBIT](https://huggingface.co/bmi-labmedinfo/medBIT) <sup>*</sup>| Medical | MaskedLM Pretrain | BERT model trained after [BioBIT](https://huggingface.co/bmi-labmedinfo/bioBIT) with an additional 100MB of medical textbook data, without any regularization. |
| [MedBIT-R3+](https://huggingface.co/bmi-labmedinfo/medBIT-r3-plus) (recommended) <sup>*</sup>| Medical | MaskedLM Pretrain | BERT model trained after [BioBIT](https://huggingface.co/bmi-labmedinfo/bioBIT) with an additional 200MB of medical textbook data and web-crawled medical resources in Italian. Regularized with LLRD (.95), Mixout (.9), and Warmup (.02). |
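
The checkpoints in the table are hosted as standard 🤗 Transformers repositories, so they can be loaded with the usual `Auto*` classes. The snippet below is a minimal sketch, assuming the repo IDs from the table are publicly downloadable and that `transformers` and `torch` are installed; the 350M Igea variant and the Italian prompts are chosen purely for illustration.

```python
# Minimal sketch: loading the Italian biomedical checkpoints listed above.
# Assumes the repo IDs from the table are public and AutoModel-compatible.
from transformers import AutoModelForCausalLM, AutoModelForMaskedLM, AutoTokenizer

# Causal LM (Igea): continue an Italian biomedical prompt.
causal_id = "bmi-labmedinfo/Igea-350M-v0.1"
causal_tok = AutoTokenizer.from_pretrained(causal_id)
causal_lm = AutoModelForCausalLM.from_pretrained(causal_id)
inputs = causal_tok("Il paziente presenta", return_tensors="pt")  # "The patient presents"
generated = causal_lm.generate(**inputs, max_new_tokens=30)
print(causal_tok.decode(generated[0], skip_special_tokens=True))

# Masked LM (MedBIT-R3+): predict the most likely token for a mask.
mlm_id = "bmi-labmedinfo/medBIT-r3-plus"
mlm_tok = AutoTokenizer.from_pretrained(mlm_id)
mlm = AutoModelForMaskedLM.from_pretrained(mlm_id)
# "The therapy involves the administration of [MASK]."
masked = mlm_tok(
    f"La terapia prevede la somministrazione di {mlm_tok.mask_token}.",
    return_tensors="pt",
)
logits = mlm(**masked).logits
mask_pos = (masked["input_ids"] == mlm_tok.mask_token_id).nonzero(as_tuple=True)[1]
print(mlm_tok.decode(logits[0, mask_pos].argmax(-1)))
```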