---
license: apache-2.0
tags:
- generated_from_trainer
base_model: meta-llama/Meta-Llama-3-8B-Instruct
model-index:
- name: fiveflow/KoLlama-3-8B-Instruct
  results: []
language:
- ko
---
## How to use

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, TextGenerationPipeline

model_path = 'fiveflow/KoLlama-3-8B-Instruct'

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path,
                                             device_map="auto",
                                             # load_in_4bit=True,
                                             low_cpu_mem_usage=True)

pipe = TextGenerationPipeline(model=model, tokenizer=tokenizer)
```
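
To run generation, the prompt can first be formatted with the model's chat template. The snippet below is a minimal sketch, assuming the tokenizer ships a Llama-3-Instruct-style chat template; the Korean prompt and the sampling parameters are illustrative only, not prescribed by this card.

```python
# Minimal generation sketch (assumes the tokenizer provides a Llama-3-style chat
# template; the prompt text and sampling settings are only examples).
messages = [
    {"role": "user", "content": "대한민국의 수도는 어디인가요?"},  # "What is the capital of South Korea?"
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

outputs = pipe(prompt,
               max_new_tokens=256,
               do_sample=True,
               temperature=0.7,
               return_full_text=False)
print(outputs[0]["generated_text"])
```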