Update README.md (#1)
- Update README.md (b35fc6ffd761a017a7289e7b1d4e51cbf5defc72)
Co-authored-by: Potsawee Manakul <[email protected]>
README.md CHANGED
---
license: apache-2.0
language:
- th
pipeline_tag: text-generation
tags:
- pretrained
---

# Typhoon-7B: Thai Large Language Model

**Typhoon-7B** is a *pretrained* Thai 🇹🇭 large language model with 7 billion parameters, based on Mistral-7B.

**Typhoon-7B** outperforms all open-source Thai language models available at the time of writing, as evaluated on Thai examination benchmarks, and its instruction-tuned variant achieves the best results on instruction-following tasks. Its performance in Thai is also on par with GPT-3.5, while it is 2.62 times more efficient at tokenizing Thai text.
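
As a rough illustration of the tokenization-efficiency claim, the sketch below compares Thai token counts against GPT-3.5's `cl100k_base` encoding via `tiktoken`. The `scb10x/typhoon-7b` repository ID is an assumption (adjust it to this repo's actual path), and exact ratios will vary with the input text.

```python
# Hedged sketch: compare Thai token counts between this model's tokenizer
# and GPT-3.5's cl100k_base encoding. "scb10x/typhoon-7b" is an assumed repo ID.
import tiktoken
from transformers import AutoTokenizer

typhoon_tok = AutoTokenizer.from_pretrained("scb10x/typhoon-7b")  # assumed repo ID
gpt35_enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-3.5

text = "ภาษาไทยเป็นภาษาราชการของประเทศไทย"  # "Thai is the official language of Thailand"
n_typhoon = len(typhoon_tok.encode(text, add_special_tokens=False))
n_gpt35 = len(gpt35_enc.encode(text))
print(f"Typhoon: {n_typhoon} tokens | GPT-3.5: {n_gpt35} tokens | "
      f"ratio {n_gpt35 / n_typhoon:.2f}x")
```
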
<div align="center">
<img src="https://storage.googleapis.com/scb10x-ai-lab-public/assets/typhoon_benchmark.png" alt="Typhoon benchmark" width="100%" style="margin-left: auto; margin-right: auto; display: block;"/>
</div>

For full details of this model, please read our [paper]() and [release blog post]().
## Model Description
- **Model type**: A 7B pretrained decoder-only model
- **Requirement**: transformers 4.34.0 or newer (see the usage sketch after this list)
- **Primary Language(s)**: Thai 🇹🇭 and English 🇬🇧
- **License**: Apache-2.0 (Commercial)
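
A minimal usage sketch, assuming the model is hosted under the `scb10x/typhoon-7b` repository ID (substitute this repo's actual path) and that `torch` and `accelerate` are installed:

```python
# Minimal generation sketch for a pretrained base model.
# "scb10x/typhoon-7b" is an assumed repository ID; adjust as needed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "scb10x/typhoon-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # use torch.float32 if bfloat16 is unsupported
    device_map="auto",           # requires accelerate; drop for CPU-only use
)

# A base model completes text; it is not tuned to answer instructions.
prompt = "ประเทศไทยมีจังหวัดทั้งหมด"  # "Thailand has a total of ... provinces"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
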
## Performance on Thai Benchmark

| **Model**           | **ONET** | **IC** | **TGAT** | **TPAT-1** | **A-Level** |
|---------------------|----------|--------|----------|------------|-------------|
| Typhoon-7B          | 0.379    | 0.393  | 0.700    | 0.414      | 0.324       |
| SeaLLM-7B           | 0.342    | 0.256  | 0.589    | 0.336      | 0.305       |
| OpenThaiGPT-beta-7B | 0.180    | 0.278  | 0.411    | 0.319      | 0.243       |
| WangChanGLM         | 0.192    | 0.271  | 0.167    | 0.172      | 0.175       |
| SEA-LION-7B         | 0.179    | 0.290  | 0.244    | 0.198      | 0.175       |
| Avg. Human          | 0.318    | -      | 0.472    | 0.406      | -           |
## Intended Uses & Limitations

This model is a pretrained base model; it may not be able to follow human instructions without one-/few-shot learning or instruction fine-tuning (see the few-shot sketch below). The model does not have any moderation mechanisms and may generate harmful or inappropriate responses.
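
Because of this, downstream use typically relies on few-shot prompting. A hedged sketch, reusing `tokenizer` and `model` from the example in the Model Description section; the prompt format below is illustrative, not a prescribed template:

```python
# Illustrative few-shot prompt: show the base model a pattern and let it
# complete the next answer. Reuses `tokenizer` and `model` from the earlier
# sketch; this format is an assumption, not an official template.
few_shot = """แปลเป็นภาษาอังกฤษ: สวัสดีครับ
คำตอบ: Hello.

แปลเป็นภาษาอังกฤษ: ขอบคุณมาก
คำตอบ: Thank you very much.

แปลเป็นภาษาอังกฤษ: วันนี้อากาศดีมาก
คำตอบ:"""

inputs = tokenizer(few_shot, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
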
## SCB10X AI Team

- Kunat Pipatanakul, Phatrasek Jirabovonvisut, Potsawee Manakul, Sittipong Sripaisarnmongkol, Ruangsak Patomwong, Pathomporn Chokchainant, Kasima Tharnpipitchai
- Corresponding Author: [email protected]
- If you find Typhoon-7B useful for your work, please cite it using:
```

```