update model card meta information
README.md
CHANGED
@@ -1,6 +1,11 @@
----
-license: apache-2.0
----
+---
+license: apache-2.0
+language:
+- en
+- ja
+base_model:
+- mistralai/Mistral-7B-v0.1
+---
 # RakutenAI-7B
 ## Model Description
 RakutenAI-7B is a systematic initiative that brings the latest technologies to the world of Japanese LLMs. RakutenAI-7B achieves the best scores on the Japanese language understanding benchmarks while maintaining a competitive performance on the English test sets among similar models such as OpenCalm, Elyza, Youri, Nekomata and Swallow. RakutenAI-7B leverages the Mistral model architecture and is based on [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) pre-trained checkpoint, exemplifying a successful retrofitting of the pre-trained model weights. Moreover, we extend Mistral's vocabulary from 32k to 48k to offer a better character-per-token rate for Japanese.
@@ -100,4 +105,4 @@ For citing our work on the suite of RakutenAI-7B models, please use:
 archivePrefix={arXiv},
 primaryClass={cs.CL}
 }
-```
+```
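The "character-per-token rate" the model description cites is simply the number of characters in the input text divided by the number of tokens the tokenizer produces for it. A minimal sketch of that metric follows; the token counts in the example are hypothetical illustrations, not measurements from the actual 32k or 48k tokenizers:

```python
def chars_per_token(text: str, num_tokens: int) -> float:
    """Average number of input characters covered by each tokenizer token.

    A higher value means the tokenizer encodes the language more compactly,
    which is what extending the vocabulary (32k -> 48k) targets for Japanese.
    """
    if num_tokens <= 0:
        raise ValueError("num_tokens must be positive")
    return len(text) / num_tokens


# Hypothetical illustration (token counts are made up, not measured):
# an 11-character Japanese sentence split into 8 tokens by a smaller
# vocabulary vs. 5 tokens by an extended one.
text = "こんにちは、元気です。"
print(chars_per_token(text, 8))  # 1.375
print(chars_per_token(text, 5))  # 2.2
```

With the `transformers` library, `num_tokens` would come from `len(tokenizer(text)["input_ids"])` for each tokenizer being compared.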