Question Answering
PEFT
English
Marcus Cedric R. Idia committed · Commit ac6dbde · 1 Parent(s): bb8ceb9

Update README.md

Files changed (1): README.md (+2 −14)
README.md CHANGED

```diff
@@ -39,6 +39,8 @@ from peft import LoraConfig, get_peft_model
 import torch
 from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
 
+login() # Need access to the gated model.
+
 # Load LLAMA 2 model
 model_name = "meta-llama/Llama-2-13b-chat-hf"
 
@@ -78,17 +80,3 @@ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
 This loads the LLAMA 2 model, applies 4-bit quantization and LoRA optimizations, constructs a prompt, and generates a response.
 
 See the [docs](https://huggingface.co/docs/transformers/model_doc/auto#transformers.AutoModelForCausalLM) for more details.
-
-## Training
-
-The model was trained by Anthropic using self-supervised learning. See the [model card](https://huggingface.co/USERNAME/archimedes) for details.
-
-## License
-
-Archimedes is released under the Apache 2.0 license.
-
-## Citation
-
-Coming soon!
-
-Please ⭐ if this repository was helpful!
```