Cannot load the model
Hi, I am having trouble loading the tokenizer. My code is as follows:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('Chanblock/Photolens-llama-2-7b-langchain-chat-fine-tuning', use_fast=False)
```

But it raised the following traceback:
```
TypeError                                 Traceback (most recent call last)
<ipython-input> in <cell line: 15>()
     13 #tokenizer = AutoTokenizer.from_pretrained('Photolens/llama-2-7b-langchain-chat',use_fast=False)
     14
---> 15 tokenizer = AutoTokenizer.from_pretrained('Chanblock/Photolens-llama-2-7b-langchain-chat-fine-tuning',use_fast=False)
     16 model = AutoModelForCausalLM.from_pretrained('Chanblock/Photolens-llama-2-7b-langchain-chat-fine-tuning',device_map='auto',torch_dtype=torch.float16)
     17

4 frames
/usr/local/lib/python3.10/dist-packages/transformers/models/llama/tokenization_llama.py in get_spm_processor(self, from_slow)
    197             return tokenizer
    198
--> 199         with open(self.vocab_file, "rb") as f:
    200             sp_model = f.read()
    201             model_pb2 = import_protobuf(f"The new behaviour of {self.__class__.__name__} (with self.legacy = False)")

TypeError: expected str, bytes or os.PathLike object, not NoneType
```
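For what it's worth, the traceback shows that `self.vocab_file` is `None` when the slow Llama tokenizer tries to open it, which typically means the repository is missing the SentencePiece `tokenizer.model` file. A minimal local sketch of that failure mode (no model download needed; the `vocab_file = None` value stands in for what the tokenizer ends up with):

```python
# Reproduce the failure mode locally: open() receives None because the
# tokenizer cannot find a SentencePiece vocab file in the repo.
vocab_file = None  # stand-in for self.vocab_file when tokenizer.model is absent

try:
    with open(vocab_file, "rb") as f:
        f.read()
except TypeError as e:
    # Same error as in the traceback above.
    print(e)  # expected str, bytes or os.PathLike object, not NoneType
```

If that is the cause here, one workaround worth trying (an assumption, since I haven't inspected either repo) is loading the tokenizer from the base model repo, e.g. `AutoTokenizer.from_pretrained('Photolens/llama-2-7b-langchain-chat', use_fast=False)`, while still loading the fine-tuned weights with `AutoModelForCausalLM`.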