---
license: cc-by-nc-4.0
language:
- ko
library_name: transformers
pipeline_tag: text-generation
---

**The license is `cc-by-nc-4.0`.**

# **GAI-LLM/OPEN-SOLAR-KO-10.7B-mixed-v15**

## Model Details

**Model Developers** Donghoon Oh, Hanmin Myung, Eunyoung Kim (SK C&C G.AI Eng)

**Input** Models input text only.

**Output** Models generate text only.

**Model Architecture** GAI-LLM/OPEN-SOLAR-KO-10.7B-mixed-v15 is an auto-regressive language model based on the LLaMA2 transformer architecture.

**Base Model** [beomi/OPEN-SOLAR-KO-10.7B](https://huggingface.co/beomi/OPEN-SOLAR-KO-10.7B)

**Training Dataset**

- We combined open Korean datasets using a mixed strategy.
- Training was done on 8× A100 80GB GPUs.

# **Model Benchmark**

## KO-LLM leaderboard

- Results are reported on the [Open KO-LLM LeaderBoard](https://huggingface.co/spaces/upstage/open-ko-llm-leaderboard).

# Implementation Code

```python
### GAI-LLM/OPEN-SOLAR-KO-10.7B-mixed-v15
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

repo = "GAI-LLM/OPEN-SOLAR-KO-10.7B-mixed-v15"

# Load the model in float16 and spread it across available devices.
model = AutoModelForCausalLM.from_pretrained(
    repo,
    return_dict=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(repo)
```
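
Once the model and tokenizer are loaded, text can be generated with the standard `transformers` `generate` API. The sketch below is a minimal illustration; the Korean prompt and the sampling parameters (`max_new_tokens`, `temperature`, `top_p`) are assumptions chosen for demonstration, not values recommended by the model authors.

```python
# Minimal generation sketch; the prompt and sampling settings are illustrative only.
prompt = "대한민국의 수도는"  # "The capital of South Korea is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=64,
        do_sample=True,
        temperature=0.7,
        top_p=0.9,
    )

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```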