# T3Q-LLM-solar10.8-sft-v1.0
This model is a supervised fine-tuned (SFT) version of yanolja/EEVE-Korean-Instruct-10.8B-v1.0.

**Model Developers:** Chihoon Lee (chihoonlee10), T3Q
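Below is a minimal inference sketch using the `transformers` library. The prompt template is an assumption carried over from the base EEVE-Korean-Instruct model; adjust it if the SFT data used a different format.

```python
# Minimal inference sketch (prompt template assumed from the base
# EEVE-Korean-Instruct model; adapt if the SFT data differs).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "T3Q-LLM/T3Q-LLM-solar10.8-sft-v1.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 weights fit on your GPU
    device_map="auto",
)

prompt_template = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions.\n"
    "Human: {prompt}\nAssistant:\n"
)

inputs = tokenizer(
    prompt_template.format(prompt="한국의 수도는 어디인가요?"),
    return_tensors="pt",
).to(model.device)

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Print only the newly generated tokens
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```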
Zero-shot KoBEST results, reported by the lm-evaluation-harness (a reproduction sketch follows the tables below):

`hf (pretrained=T3Q-LLM/T3Q-LLM-solar10.8-sft-v1.0), limit: None, provide_description: False, num_fewshot: 0, batch_size: None`
| Task             | Version | Metric   | Value  |   | Stderr |
|------------------|---------|----------|--------|---|--------|
| kobest_boolq     | 0       | acc      | 0.9288 | ± | 0.0069 |
|                  |         | macro_f1 | 0.9286 | ± | 0.0069 |
| kobest_copa      | 0       | acc      | 0.7440 | ± | 0.0138 |
|                  |         | macro_f1 | 0.7434 | ± | 0.0138 |
| kobest_hellaswag | 0       | acc      | 0.4880 | ± | 0.0224 |
|                  |         | acc_norm | 0.5600 | ± | 0.0222 |
|                  |         | macro_f1 | 0.4854 | ± | 0.0224 |
| kobest_sentineg  | 0       | acc      | 0.8589 | ± | 0.0175 |
|                  |         | macro_f1 | 0.8589 | ± | 0.0175 |
Baseline: the same evaluation on the base model:

`hf (pretrained=yanolja/EEVE-Korean-Instruct-10.8B-v1.0), limit: None, provide_description: False, num_fewshot: 0, batch_size: None`
| Task             | Version | Metric   | Value  |   | Stderr |
|------------------|---------|----------|--------|---|--------|
| kobest_boolq     | 0       | acc      | 0.9188 | ± | 0.0073 |
|                  |         | macro_f1 | 0.9185 | ± | 0.0073 |
| kobest_copa      | 0       | acc      | 0.7520 | ± | 0.0137 |
|                  |         | macro_f1 | 0.7516 | ± | 0.0136 |
| kobest_hellaswag | 0       | acc      | 0.4840 | ± | 0.0224 |
|                  |         | acc_norm | 0.5580 | ± | 0.0222 |
|                  |         | macro_f1 | 0.4804 | ± | 0.0223 |
| kobest_sentineg  | 0       | acc      | 0.8514 | ± | 0.0179 |
|                  |         | macro_f1 | 0.8508 | ± | 0.0180 |
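The results above can be reproduced with the lm-evaluation-harness. The sketch below uses the `lm_eval` Python API; the exact entry point and flags depend on the harness version installed, so treat it as an assumption rather than the exact command used for the numbers reported here.

```python
# Reproduction sketch for the zero-shot KoBEST evaluation above.
# Assumes lm-evaluation-harness >= 0.4 (`pip install lm-eval`); older releases
# expose a different CLI/API, so adapt as needed.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=T3Q-LLM/T3Q-LLM-solar10.8-sft-v1.0,dtype=bfloat16",
    tasks=["kobest_boolq", "kobest_copa", "kobest_hellaswag", "kobest_sentineg"],
    num_fewshot=0,
    batch_size="auto",
)

# Print per-task metrics (acc, acc_norm, macro_f1, and their stderr values)
for task, metrics in results["results"].items():
    print(task, metrics)
```

Swap the `pretrained=` argument for `yanolja/EEVE-Korean-Instruct-10.8B-v1.0` to reproduce the baseline table.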