---
license: apache-2.0
datasets:
- shibing624/alpaca-zh
language:
- zh
tags:
- LoRA
- LLaMA
- Alpaca
- PEFT
- int8
---

# Model Card for llama-7b-alpaca-zh-20k

<!-- Provide a quick summary of what the model is/does. -->

This repository contains LoRA adapter weights for LLaMA-7B (decapoda-research/llama-7b-hf), fine-tuned with PEFT on the shibing624/alpaca-zh Chinese instruction-following (Alpaca) dataset.

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

Load the 8-bit quantized base model and attach the LoRA adapter with PEFT:
```python
import torch
from peft import PeftModel
from transformers import GenerationConfig, LlamaForCausalLM, LlamaTokenizer

# Load the tokenizer and the 8-bit quantized LLaMA-7B base model
tokenizer = LlamaTokenizer.from_pretrained("decapoda-research/llama-7b-hf")
model = LlamaForCausalLM.from_pretrained(
    "decapoda-research/llama-7b-hf",
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Attach the LoRA adapter weights from this repository
model = PeftModel.from_pretrained(
    model,
    "DataAgent/llama-7b-alpaca-zh-20k",
    torch_dtype=torch.float16,
)
```
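
Alpaca-style adapters are typically queried with the Stanford Alpaca prompt template. The card does not state the exact template or generation settings used, so the sketch below is illustrative only: the prompt wording, the example instruction, and the sampling parameters are assumptions, not part of this repository.

```python
# Minimal generation sketch (assumed Alpaca prompt template and sampling settings).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n"
    "介绍一下北京的名胜古迹\n\n"  # e.g., "Introduce the scenic spots of Beijing"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Illustrative sampling parameters; tune for your use case.
generation_config = GenerationConfig(
    do_sample=True,
    temperature=0.2,
    top_p=0.75,
    max_new_tokens=256,
)

model.eval()
with torch.no_grad():
    output = model.generate(
        input_ids=inputs["input_ids"],
        generation_config=generation_config,
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```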