Some bugs in the weights.

#2 opened by yixuantt

Hi, when I load this model using the same method as Llama-3.1-8B-Instruct, I run into a bug:

Some weights of LlamaForCausalLM were not initialized from the model checkpoint at Psychotherapy-LLM/PsychoCounsel-Llama3-8B and are newly initialized: ['lm_head.weight']
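For reference, the loading and generation code was roughly the following (a minimal sketch of the usual transformers workflow for Llama-3.1-8B-Instruct; the prompt, dtype, and device settings are illustrative assumptions, not the exact script):

```python
# Minimal sketch, assuming the standard transformers API used for
# Llama-3.1-8B-Instruct; dtype, device_map, and the prompt are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Psychotherapy-LLM/PsychoCounsel-Llama3-8B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Chat-style prompt, formatted the same way as for Llama-3.1-8B-Instruct.
messages = [{"role": "user", "content": "I have been feeling anxious lately."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Since the warning says lm_head.weight is newly initialized rather than loaded from the checkpoint, the final projection to vocabulary logits is essentially random, which would explain the unreadable generations.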

The output is also not readable. For example:

Universities Blade Ont unnecessarilyيل consequently фарPopularPopular fluidしたら Angeurret.routesimaryнок.nasa("%splashdosDiesurret intoxicated jumlah\Controllers tapiädchenолCONDS맥еляَم відбуваしたら_growthしたら_nom openingsolumn Oslo)./@Module Hague
Psychotherapy with Large Language Models org

Hi Yixuan,

Sorry for the mistake in the uploaded model. I have fixed the model weights. Please try it again.

Best

Now it works fine. Thanks for the wonderful work!

yixuantt changed discussion status to closed
