
[7b-it-GGUF] mismatch in special tokens definition

#2
by dvappco - opened

The number of defined special tokens differs between what was initially expected and what is currently being used. Could this discrepancy cause any issues during model inference?

llm_load_vocab: mismatch in special tokens definition ( 544/256128 vs 388/256128 )
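For context, this warning seems to come from llama.cpp comparing the number of tokens its vocabulary scan treats as special against the number marked as special in the GGUF metadata. A minimal sketch of how one could check the metadata side, assuming the `gguf` Python package (`pip install gguf`), that the token-type array is stored under the key `tokenizer.ggml.token_type` (as it is for llama.cpp-converted models), and a hypothetical local file name:

```python
# Minimal sketch: count the token types recorded in a GGUF file's metadata.
# The file name below is hypothetical; point it at the downloaded GGUF.
from gguf import GGUFReader, TokenType

reader = GGUFReader("gemma-7b-it.Q4_K_M.gguf")  # hypothetical path

# "tokenizer.ggml.token_type" holds one type id per vocabulary entry.
field = reader.fields["tokenizer.ggml.token_type"]
token_types = [int(field.parts[i][0]) for i in field.data]

n_control = sum(t == TokenType.CONTROL for t in token_types)
n_user_defined = sum(t == TokenType.USER_DEFINED for t in token_types)
print(f"vocab size:          {len(token_types)}")
print(f"control tokens:      {n_control}")
print(f"user-defined tokens: {n_user_defined}")
```

If the control-token count printed here matches the second number in the log (388), the first number (544) is presumably just llama.cpp's own scan flagging more candidates than the metadata declares, which would make the message informational rather than an error.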

Google org

I'm not aware of any issues yet.
