klyang committed
Commit 36cef0e · 1 Parent(s): 9e16948

Update README.md

Files changed (1)
  1. README.md +6 -6
README.md CHANGED
@@ -52,16 +52,16 @@ In addition to MentaLLaMA-33B-lora, the MentaLLaMA project includes another mode
 You can use the MentaLLaMA-33B-lora model in your Python project with the Hugging Face Transformers library. Here is a simple example of how to load the model:

 Since our model is based on the Vicuna-33B foundation model, you need to first download the Vicuna-33B model [here](https://huggingface.co/lmsys/vicuna-33b-v1.3),
- and put it under the `./Vicuna-33B` dir.
+ and put it under the `./vicuna-33B` dir. Then download the MentaLLaMA-33B-lora weights and put them under the `./MentaLLaMA-33B-lora` dir.

 ```python
- from transformers import LlamaTokenizer, LlamaForCausalLM
- tokenizer = LlamaTokenizer.from_pretrained('klyang/MentaLLaMA-chat-13B')
- model = LlamaForCausalLM.from_pretrained('klyang/MentaLLaMA-chat-13B', device_map='auto')
+ from peft import AutoPeftModelForCausalLM
+ from transformers import AutoTokenizer
+ peft_model = AutoPeftModelForCausalLM.from_pretrained("./MentaLLaMA-33B-lora")
+ tokenizer = AutoTokenizer.from_pretrained('./MentaLLaMA-33B-lora')
 ```

- In this example, LlamaTokenizer is used to load the tokenizer, and LlamaForCausalLM is used to load the model. The `device_map='auto'` argument is used to automatically
- use the GPU if it's available.
+ In this example, AutoPeftModelForCausalLM automatically loads the base model together with the LoRA weights from the downloaded dir, and AutoTokenizer loads the tokenizer.

 ## License
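For context, a minimal end-to-end sketch of the loading path documented by this change might look like the following. It assumes the LoRA weights have already been downloaded to `./MentaLLaMA-33B-lora`, that the adapter config resolves the Vicuna-33B base model (e.g. from the local `./vicuna-33B` copy), and that enough GPU memory is available; the prompt string is a placeholder, not the project's official prompt template.

```python
# Sketch only: load the LoRA-adapted model and run a single generation.
# Paths, prompt, and generation settings are assumptions, not part of the commit.
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('./MentaLLaMA-33B-lora')
# device_map='auto' spreads the 33B weights across the available GPUs.
model = AutoPeftModelForCausalLM.from_pretrained('./MentaLLaMA-33B-lora', device_map='auto')

# Placeholder prompt: replace with the prompt format used by the project.
prompt = "Consider this post: I can't sleep and nothing feels worth doing. Question: What is the poster's mental state?"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```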