---
library_name: transformers
license: apache-2.0
pipeline_tag: text-generation
datasets:
- maywell/ko_Ultrafeedback_binarized
base_model:
- yanolja/EEVE-Korean-Instruct-10.8B-v1.0
---
![image/png](https://cdn-uploads.huggingface.co/production/uploads/65f22e4076fedc4fd11e978f/MoTedec_ZL8GM2MmGyAPs.png)
# T3Q-LLM-MG-v1.0
## Model Developers: Chihoon Lee (chihoonlee10), T3Q
### Python code
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer

MODEL_DIR = "chihoonlee10/T3Q-LLM-MG-v1.0"

# Load the model in fp16 on GPU, together with its tokenizer.
model = AutoModelForCausalLM.from_pretrained(MODEL_DIR, torch_dtype=torch.float16).to("cuda")
tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)

# Stream generated tokens to stdout, skipping the prompt and special tokens.
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

s = "한국의 수도는 어디?"  # "What is the capital of Korea?"
conversation = [{'role': 'user', 'content': s}]

# Build the chat-formatted prompt and move the token ids to the GPU.
inputs = tokenizer.apply_chat_template(
    conversation,
    tokenize=True,
    add_generation_prompt=True,
    return_tensors='pt').to("cuda")

_ = model.generate(inputs, streamer=streamer, max_new_tokens=1024)
```
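If you want the response as a string rather than streamed to stdout, you can decode the newly generated tokens directly. A minimal sketch (continuing from the variables above; the `max_new_tokens` value is just an example):

```python
# Generate without a streamer, then decode only the tokens produced after the prompt.
output_ids = model.generate(inputs, max_new_tokens=1024)
response = tokenizer.decode(output_ids[0][inputs.shape[-1]:], skip_special_tokens=True)
print(response)
```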
### chihoonlee10/T3Q-LLM-MG-v1.0

hf (pretrained=chihoonlee10/T3Q-LLM-MG-v1.0), limit: None, provide_description: False, num_fewshot: 0, batch_size: None
| Task |Version| Metric |Value | |Stderr|
|----------------|------:|--------|-----:|---|-----:|
|kobest_boolq | 0|acc |0.9523|± |0.0057|
| | |macro_f1|0.9523|± |0.0057|
|kobest_copa | 0|acc |0.7740|± |0.0132|
| | |macro_f1|0.7737|± |0.0133|
|kobest_hellaswag| 0|acc |0.4980|± |0.0224|
| | |acc_norm|0.5920|± |0.0220|
| | |macro_f1|0.4950|± |0.0223|
|kobest_sentineg | 0|acc |0.7254|± |0.0224|
| | |macro_f1|0.7106|± |0.0234|
### T3Q-LLM/T3Q-LLM-sft1.0-dpo1.0 (for comparison)
| Task |Version| Metric |Value | |Stderr|
|----------------|------:|--------|-----:|---|-----:|
|kobest_boolq | 0|acc |0.9387|± |0.0064|
| | |macro_f1|0.9387|± |0.0064|
|kobest_copa | 0|acc |0.7590|± |0.0135|
| | |macro_f1|0.7585|± |0.0135|
|kobest_hellaswag| 0|acc |0.5080|± |0.0224|
| | |acc_norm|0.5580|± |0.0222|
| | |macro_f1|0.5049|± |0.0224|
|kobest_sentineg | 0|acc |0.8489|± |0.0180|
| | |macro_f1|0.8483|± |0.0180|
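The tables above appear to follow the output format of EleutherAI's lm-evaluation-harness. A hedged sketch of how a comparable 0-shot KoBEST run could be reproduced with a recent version of the harness (task names and the `simple_evaluate` API are taken from the harness, not from this model card, so this is not necessarily the exact command the authors used):

```python
import lm_eval

# 0-shot evaluation on the KoBEST tasks reported above.
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=chihoonlee10/T3Q-LLM-MG-v1.0,dtype=float16",
    tasks=["kobest_boolq", "kobest_copa", "kobest_hellaswag", "kobest_sentineg"],
    num_fewshot=0,
)
print(results["results"])
```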