
Colab: no pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack file

#27
by Danis457845 - opened

Hello :)
I am using Colab to load and run the model with the "standard" loading code:

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("reeducator/vicuna-13b-free", use_fast=False)
model = AutoModelForCausalLM.from_pretrained("reeducator/vicuna-13b-free")

But I receive this error:

Requirement already satisfied: sentencepiece in /usr/local/lib/python3.10/dist-packages (0.1.99)

OSError                                   Traceback (most recent call last)
in <cell line: 15>()
     13
     14 tokenizer = AutoTokenizer.from_pretrained("reeducator/vicuna-13b-free", use_fast=False)
---> 15 model = AutoModelForCausalLM.from_pretrained("reeducator/vicuna-13b-free")
     16
     17 # Define the API endpoint for model inference

/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
   2553                 )
   2554             else:
-> 2555                 raise EnvironmentError(
   2556                     f"{pretrained_model_name_or_path} does not appear to have a file named"
   2557                     f" {_add_variant(WEIGHTS_NAME, variant)}, {TF2_WEIGHTS_NAME}, {TF_WEIGHTS_NAME} or"

OSError: reeducator/vicuna-13b-free does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
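For context on what this error means: the check that fails here is essentially a filename lookup — from_pretrained looks for one of a few canonical weight-file names in the repo and raises if none is present. Below is a minimal, simplified sketch of that logic (an illustration, not the actual transformers code, which also handles variants, sharded index files, and safetensors priority):

```python
# Simplified sketch of the weight-file lookup behind the OSError above.
# The real transformers implementation also checks sharded index files
# such as pytorch_model.bin.index.json.
CANONICAL_WEIGHT_NAMES = [
    "model.safetensors",
    "pytorch_model.bin",
    "tf_model.h5",
    "model.ckpt.index",
    "flax_model.msgpack",
]

def find_weight_file(repo_files):
    """Return the first canonical weight file present, else raise like from_pretrained."""
    for name in CANONICAL_WEIGHT_NAMES:
        if name in repo_files:
            return name
    raise EnvironmentError(
        "repo does not appear to have a file named "
        + ", ".join(CANONICAL_WEIGHT_NAMES)
    )

# A repo whose weights are stored under non-standard filenames (for example a
# quantized/GGML-style file; name below is hypothetical) fails this lookup even
# though it does contain weights:
# find_weight_file(["config.json", "vicuna-13b-free-q5_0.bin"])  # raises OSError
```

So if the repo's Files tab shows weights only under non-standard names (as the error suggests), AutoModelForCausalLM cannot locate them regardless of the code used to load the model.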

Can someone help please?
