Update README.md
README.md CHANGED
@@ -7,7 +7,7 @@ datasets:
---
# Model Card of instructionMBERTv1 for Bertology

-A minimalistic instruction model with an already good analysed and pretrained encoder like
+A minimalistic multilingual instruction model with an already well-analysed and pretrained encoder like mBERT.
So we can research the [Bertology](https://aclanthology.org/2020.tacl-1.54.pdf) with instruction-tuned models, [look at the attention](https://colab.research.google.com/drive/1mNP7c0RzABnoUgE6isq8FTp-NuYNtrcH?usp=sharing) and investigate [what happens to BERT embeddings during fine-tuning](https://aclanthology.org/2020.blackboxnlp-1.4.pdf).

The training code is released at the [instructionBERT repository](https://gitlab.com/Bachstelze/instructionbert).