Update README.md
## BERT base-uncased in Swahili

This model was trained using HuggingFace's Flax framework and is part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104) organized by [HuggingFace](https://huggingface.co). All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.

## How to use

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("flax-community/bert-base-uncased-swahili")
model = AutoModelForMaskedLM.from_pretrained("flax-community/bert-base-uncased-swahili")
```

#### **Training Data**:

This model was trained on [Swahili Safi](https://huggingface.co/datasets/flax-community/swahili-safi).
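Once loaded, a masked-LM checkpoint like this one can also be exercised through the standard `transformers` `fill-mask` pipeline. This is a sketch rather than part of the model card: the example sentence is an assumption for illustration, and the actual predictions depend on the trained weights.

```python
from transformers import pipeline

# Masked-token prediction with the same checkpoint; a BERT-style
# tokenizer uses the literal [MASK] placeholder in the input text.
fill_mask = pipeline("fill-mask", model="flax-community/bert-base-uncased-swahili")

# Hypothetical example: "Nairobi ni [MASK] wa Kenya."
# ("Nairobi is the [MASK] of Kenya.")
for prediction in fill_mask("Nairobi ni [MASK] wa Kenya."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Each prediction dict also carries the fully filled `sequence` string, which is useful when inspecting more than just the top-scoring token.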