---
library_name: transformers
pipeline_tag: text-generation
tags:
- pretrained
---

# Model Card for Typhoon-7B

Typhoon-7B is a pretrained Thai-language adaptation of Mistral-7B with 7 billion parameters.

Typhoon-7B outperforms all open-source Thai language models, and its performance is on par with GPT-3.5 while being 2.62 times more efficient.

[SHOW_RESULT_IMAGE_HERE]

For full details of this model, please read our [paper]() and [release blog post]().

## Requirements

Transformers 4.34.0 or newer.

## License

Apache-2.0 (commercial use permitted)

## Notice

Typhoon-7B is a pretrained base model: it cannot follow human instructions without few-shot prompting or fine-tuning on an instruction dataset, and it does not have any moderation mechanisms.

## SCB10X AI Team

Kunat Pipatanakul, Phatrasek Jirabovonvisut, Potsawee Manakul, Sittipong Sripaisarnmongkol, Ruangsak Patomwong, Pathomporn Chokchainant, Kasima Tharnpipitchai