test #19
by chin1002 - opened

README.md CHANGED
@@ -1,6 +1,5 @@
 ---
 language: zh
-license: apache-2.0
 ---
 
 # Bert-base-chinese
@@ -20,17 +19,15 @@ license: apache-2.0
 
 This model has been pre-trained for Chinese, training and random input masking has been applied independently to word pieces (as in the original BERT paper).
 
-- **Developed by:**
+- **Developed by:** HuggingFace team
 - **Model Type:** Fill-Mask
 - **Language(s):** Chinese
-- **License:**
+- **License:** [More Information needed]
 - **Parent Model:** See the [BERT base uncased model](https://huggingface.co/bert-base-uncased) for more information about the BERT base model.
 
 ### Model Sources
-- **GitHub repo**: https://github.com/google-research/bert/blob/master/multilingual.md
 - **Paper:** [BERT](https://arxiv.org/abs/1810.04805)
 
-
 ## Uses
 
 #### Direct Use
@@ -70,4 +67,9 @@ tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
 
 model = AutoModelForMaskedLM.from_pretrained("bert-base-chinese")
 
-```
+```
+
+
+
+
+