SuperkingbasSKB committed
Commit 622eef3
1 Parent(s): 78e9114

Update README.md

Files changed (1):
  README.md (+3 -3)
README.md CHANGED

@@ -16,10 +16,10 @@ tags:
 - code
 - legal
 ---
-# OpenThaiLLM-Prebuilt: Thai & China & English Large Language Model
-**OpenThaiLLM-Prebuilt** is an 7 billion parameter instruct model designed for Thai 🇹🇭 & China 🇨🇳 language.
+# OpenThaiLLM-DoodNiLT-V1.0.0-Beta-7B: Thai & China & English Large Language Model
+**OpenThaiLLM-DoodNiLT-V1.0.0-Beta-7B** is an 7 billion parameter instruct model designed for Thai 🇹🇭 & China 🇨🇳 language.
 It demonstrates an amazing result, and is optimized for application use cases, Retrieval-Augmented Generation (RAG), Web deployment
-constrained generation, and reasoning tasks.is a Thai 🇹🇭 & China 🇨🇳 large language model with 7 billion parameters, and it is based on Qwen2.5-7B.
+constrained generation, and reasoning tasks.is a Thai 🇹🇭 & China 🇨🇳 large language model with 7 billion parameters, and it is based on Qwen2-7B.
 ## Introduction
 
 Qwen2.5 is the new series of Qwen large language models. For Qwen2, we release a number of base language models and instruction-tuned language models ranging from 0.5 to 72 billion parameters, including a Mixture-of-Experts model. This repo contains the instruction-tuned 7B Qwen2 model.