---
license: "apache-2.0"
---

# SikuBERT
## Model description

![SikuBERT](https://github.com/SIKU-BERT/SikuBERT-for-digital-humanities-and-classical-Chinese-information-processing/blob/main/appendix/sikubert.png)

Digital humanities research needs the support of large-scale corpora and high-performance natural language processing tools for ancient Chinese. Pre-trained language models have greatly improved the accuracy of text mining in English and modern Chinese texts, and there is now an urgent need for a pre-trained model dedicated to the automatic processing of ancient texts. Using the verified, high-quality full-text corpus of the “Siku Quanshu” as the training set and the BERT deep language model as the architecture, we constructed SikuBERT and SikuRoBERTa, pre-trained language models for intelligent processing tasks on ancient Chinese.
## How to use
```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("SIKU-BERT/sikuroberta")
model = AutoModel.from_pretrained("SIKU-BERT/sikuroberta")
```
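
Once loaded, the model can be used for feature extraction in the standard `transformers` way. The sketch below is illustrative rather than part of the original card: the sample sentence is an arbitrary classical Chinese line chosen for demonstration, and the forward pass assumes the usual `AutoModel` encoder interface.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("SIKU-BERT/sikuroberta")
model = AutoModel.from_pretrained("SIKU-BERT/sikuroberta")

# Sample classical Chinese text (illustrative only, not from the model card)
text = "天地玄黄，宇宙洪荒。"

# Tokenize and run a forward pass without gradient tracking
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextual token embeddings: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```

The `last_hidden_state` tensor provides per-token contextual embeddings that can feed downstream tasks on ancient Chinese, such as segmentation or named entity recognition.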
## About Us
We are from Nanjing Agricultural University.

> Created by SIKU-BERT ([GitHub](https://github.com/SIKU-BERT/SikuBERT-for-digital-humanities-and-classical-Chinese-information-processing/blob/main/appendix/sikubert.png))