---
license: apache-2.0
---
|
# **Typhoon-0130: Thai Large Language Model (Instruct)**
|
|
|
**Typhoon-0130** is an *instruct* Thai 🇹🇭 large language model with 7 billion parameters, based on Typhoon 7B. It is the first instruct model version serving [opentyphoon.ai](http://opentyphoon.ai/). It follows instructions well, using a fine-tuning technique and dataset similar to [ORCA](https://arxiv.org/abs/2306.02707) and [OpenChat](https://huggingface.co/openchat/openchat_3.5); however, it does not support system prompts.
|
|
|
## **Model Description**

- **Model type**: A 7B instruct decoder-only model based on the Mistral architecture.
- **Requirement**: transformers 4.38.0 or newer.
- **Primary Language(s)**: Thai 🇹🇭 and English 🇬🇧
- **License**: Apache-2.0
|
|
|
## Production Deployment

We suggest using the OpenAI-compatible API server from the [vLLM](https://github.com/vllm-project/vllm) project.

```bash
python -m vllm.entrypoints.openai.api_server --port 8080 --model scb10x/typhoon-7b-instruct-01-30-2024 --max-num-batched-tokens 8192 --max-model-len 8192 --served-model-name typhoon-instruct
```
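Once the server is up, any OpenAI-compatible client can talk to it. Below is a minimal sketch using only the Python standard library; the port (`8080`) and served model name (`typhoon-instruct`) mirror the launch command above, the request shape follows the standard chat-completions API, and the helper names `build_chat_request` and `ask` are illustrative, not part of vLLM.

```python
# Minimal sketch of a client for the vLLM OpenAI-compatible endpoint started
# above. Assumes the server is running locally on port 8080.
import json
import urllib.request

BASE_URL = "http://localhost:8080/v1"  # port from the launch command above


def build_chat_request(user_message: str, model: str = "typhoon-instruct") -> dict:
    """Build a /v1/chat/completions payload.

    Only a user turn is sent: this model does not support system prompts.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": 256,
    }


def ask(user_message: str) -> str:
    """POST the request to the running server and return the reply text."""
    data = json.dumps(build_chat_request(user_message)).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The same endpoint also works with the official `openai` Python client by pointing its `base_url` at the server.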
|
|
|
## Chat Template

We use the ChatML chat template.

```jinja
{% for message in messages %}{{'<|im_start|>' + message['role'] + '\n' + message['content']}}{% if (loop.last and add_generation_prompt) or not loop.last %}{{ '<|im_end|>' + '\n'}}{% endif %}{% endfor %}
{% if add_generation_prompt and messages[-1]['role'] != 'assistant' %}{{ '<|im_start|>assistant\n' }}{% endif %}
```
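For illustration, the template's logic can be reproduced in plain Python; `render_chatml` below is a hypothetical helper mirroring the Jinja rules (in practice, `tokenizer.apply_chat_template` handles this for you). Note the subtlety: the final `<|im_end|>` is emitted only when a generation prompt is requested, so a trailing turn can be left open for the model to continue.

```python
# Plain-Python rendering of the ChatML template above (illustrative only).

def render_chatml(messages: list[dict], add_generation_prompt: bool = True) -> str:
    parts = []
    for i, message in enumerate(messages):
        parts.append("<|im_start|>" + message["role"] + "\n" + message["content"])
        is_last = i == len(messages) - 1
        # Close every turn except a final one being left open for the
        # model to continue (add_generation_prompt=False).
        if (is_last and add_generation_prompt) or not is_last:
            parts.append("<|im_end|>" + "\n")
    if add_generation_prompt and messages[-1]["role"] != "assistant":
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)


# A single user turn renders with a trailing cue for the assistant to reply:
prompt = render_chatml([{"role": "user", "content": "ขอสูตรต้มยำกุ้งหน่อย"}])
# prompt ends with "<|im_start|>assistant\n"
```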
|
|
|
## **Intended Uses & Limitations**

This is an instruct model; however, it is still under development. It incorporates some level of guardrails, but it may still produce answers that are inaccurate, biased, or otherwise objectionable in response to user prompts. We recommend that developers assess these risks in the context of their use case.
|
|
|
## **Follow Us**

**https://twitter.com/opentyphoon**

## **Support**

**https://discord.gg/CqyBscMFpg**
|
|
|
## **SCB10X AI Team**

- Kunat Pipatanakul, Potsawee Manakul, Sittipong Sripaisarnmongkol, Pathomporn Chokchainant, Kasima Tharnpipitchai
- If you find Typhoon useful for your work, please cite it using:

```
@article{pipatanakul2023typhoon,
  title={Typhoon: Thai Large Language Models},
  author={Kunat Pipatanakul and Phatrasek Jirabovonvisut and Potsawee Manakul and Sittipong Sripaisarnmongkol and Ruangsak Patomwong and Pathomporn Chokchainant and Kasima Tharnpipitchai},
  year={2023},
  journal={arXiv preprint arXiv:2312.13951},
  url={https://arxiv.org/abs/2312.13951}
}
```
|
|
|
## **Contact Us**

- General & Collaboration: **[kasima@scb10x.com](mailto:kasima@scb10x.com)**, **[pathomporn@scb10x.com](mailto:pathomporn@scb10x.com)**
- Technical: **[kunat@scb10x.com](mailto:kunat@scb10x.com)**