|
--- |
|
license: cc-by-nc-4.0 |
|
datasets: |
|
- stockmark/ner-wikipedia-dataset |
|
language: |
|
- ja |
|
metrics: |
|
- f1 |
|
- precision |
|
- recall |
|
tags: |
|
- NER |
|
- information extraction |
|
- relation extraction |
|
- summarization |
|
- sentiment extraction |
|
- question-answering |
|
pipeline_tag: token-classification |
|
library_name: gliner |
|
--- |
|
|
|
# vumichien/ner-jp-gliner |
|
|
|
This model is a fine-tuned version of [ku-nlp/deberta-v3-base-japanese](https://huggingface.co/ku-nlp/deberta-v3-base-japanese) on the [stockmark/ner-wikipedia-dataset](https://huggingface.co/datasets/stockmark/ner-wikipedia-dataset) (Japanese Wikipedia NER dataset).

It achieves the following results on the evaluation set (a sketch of the metric computation follows the list):
|
- Precision: 96.07% |
|
- Recall: 89.16% |
|
- F1 score: 92.49% |
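
For reference, here is a minimal, illustrative sketch of how span-level micro precision, recall, and F1 are typically computed for NER; it is not necessarily the exact evaluation script used for the numbers above.

```python
# Illustrative span-level micro P/R/F1. Gold and predicted entities are
# (doc_id, start, end, label) tuples aggregated over the whole eval set.
def micro_prf1(gold: set, pred: set) -> tuple[float, float, float]:
    tp = len(gold & pred)  # exact span + label matches
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```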
|
|
|
## Model description |
|
|
|
This checkpoint pairs the [GLiNER](https://github.com/urchade/GLiNER) architecture with the ku-nlp/deberta-v3-base-japanese encoder. GLiNER scores candidate text spans against natural-language entity type labels in a shared latent space, so the set of entity types is chosen at inference time rather than fixed in a classification head.
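
A minimal usage sketch with the `gliner` library follows; the Japanese labels are illustrative choices mirroring the dataset's entity categories, and any label set can be passed at inference time.

```python
from gliner import GLiNER

# Load the fine-tuned checkpoint from the Hugging Face Hub
model = GLiNER.from_pretrained("vumichien/ner-jp-gliner")

text = "株式会社ストックマークは東京都港区に本社を置く。"

# Entity types are supplied at inference time; these labels are examples
labels = ["人名", "法人名", "地名", "製品名"]

entities = model.predict_entities(text, labels, threshold=0.5)
for ent in entities:
    print(ent["text"], "=>", ent["label"], f"({ent['score']:.2f})")
```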
|
|
|
## Intended uses & limitations |
|
|
|
The model is intended for named entity recognition over Japanese text, with entity types supplied as free-form labels at inference time. It was fine-tuned on Wikipedia sentences, so accuracy on distant domains (e.g., social media, clinical, or legal text) may be lower and should be validated before use. The cc-by-nc-4.0 license restricts the model to non-commercial use.
|
|
|
## Training and evaluation data |
|
|
|
The model was fine-tuned and evaluated on [stockmark/ner-wikipedia-dataset](https://huggingface.co/datasets/stockmark/ner-wikipedia-dataset), a Japanese NER corpus of roughly 5,000 Wikipedia sentences annotated with eight entity types: person (人名), corporation (法人名), political organization (政治的組織名), other organization (その他の組織名), location (地名), facility (施設名), product (製品名), and event (イベント名).
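
As a hedged sketch, the data can be inspected with the `datasets` library; the `entities` field names (`name`, `span`, `type`) are taken from the dataset's published JSON schema and should be checked against the current dataset card.

```python
from datasets import load_dataset

# Load the Japanese Wikipedia NER corpus from the Hub
ds = load_dataset("stockmark/ner-wikipedia-dataset", split="train")

example = ds[0]
print(example["text"])
# Each annotation carries a surface form, character span, and entity type
# (field names assumed from the dataset's JSON schema)
for entity in example["entities"]:
    print(entity["name"], entity["span"], entity["type"])
```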
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
The following hyperparameters were used during training (a sketch of how the optimizer and scheduler settings fit together appears after the list):
|
- num_steps: 30000 |
|
- train_batch_size: 8 |
|
- eval_every: 3000 |
|
- warmup_ratio: 0.1 |
|
- scheduler_type: "cosine" |
|
- loss_alpha: -1 |
|
- loss_gamma: 0 |
|
- label_smoothing: 0 |
|
- loss_reduction: "sum" |
|
- lr_encoder: 1e-5 |
|
- lr_others: 5e-5 |
|
- weight_decay_encoder: 0.01 |
|
- weight_decay_other: 0.01 |
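
As a hypothetical sketch (not GLiNER's documented training API), the split encoder/other learning rates, weight decays, and cosine schedule above could be assembled like this; the `token_rep_layer` substring used to pick out encoder parameters is an assumption about GLiNER's internals.

```python
import torch
from transformers import get_cosine_schedule_with_warmup

# `model` is a loaded GLiNER model (see the usage sketch above).
# Split parameters into the pretrained encoder vs. everything else;
# "token_rep_layer" is an assumed name for the encoder submodule.
encoder, others = [], []
for name, param in model.named_parameters():
    (encoder if "token_rep_layer" in name else others).append(param)

optimizer = torch.optim.AdamW([
    {"params": encoder, "lr": 1e-5, "weight_decay": 0.01},  # lr_encoder
    {"params": others, "lr": 5e-5, "weight_decay": 0.01},   # lr_others
])

num_steps = 30000
scheduler = get_cosine_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.1 * num_steps),  # warmup_ratio = 0.1
    num_training_steps=num_steps,           # scheduler_type = "cosine"
)
```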
|
|
|
### Training results |
|
|
|
| Epoch | Training Loss |
|:-----:|:-------------:|
| 1 | 1291.582200 |
| 2 | 53.290100 |
| 3 | 44.137400 |
| 4 | 35.286200 |
| 5 | 20.865500 |
| 6 | 15.890000 |
| 7 | 13.369600 |
| 8 | 11.599500 |
| 9 | 9.773400 |
| 10 | 8.372600 |
| 11 | 7.256200 |
| 12 | 6.521800 |
| 13 | 7.203800 |
| 14 | 7.032900 |
| 15 | 6.189700 |
| 16 | 6.897400 |
| 17 | 6.031700 |
| 18 | 5.329600 |
| 19 | 5.411300 |
| 20 | 5.253800 |
| 21 | 4.522000 |
| 22 | 5.107700 |
| 23 | 4.163300 |
| 24 | 4.185400 |
| 25 | 3.403100 |
| 26 | 3.272400 |
| 27 | 2.387800 |
| 28 | 3.039400 |
| 29 | 2.383000 |
| 30 | 1.895300 |
| 31 | 1.748700 |
| 32 | 1.864300 |
| 33 | 2.343000 |
| 34 | 1.356600 |
| 35 | 1.182000 |
| 36 | 0.894700 |
| 37 | 0.954900 |
|
|