Bug in fp8_cast_bf16.py
#4 opened by dzhulgakov
Hi, really excited to try out this model!
I think there's a bug in the fp8 to bf16 conversion script: it assumes that the inv_scale tensor lives in the same safetensors file as the weight it scales.
I made a patch here: https://gist.github.com/YLGH/3efd725c425b110a6eaa491474ca488a, which instead looks up each tensor's file via the weight_map and lazily loads the safetensors files (sketch below).
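For reference, a minimal sketch of that approach, assuming the standard model.safetensors.index.json layout; the tensor and file names below are illustrative, not necessarily the exact ones from the repo or the gist:

```python
# Resolve which safetensors shard holds a tensor via the index's weight_map
# and open shards lazily, instead of assuming a *_scale_inv tensor sits in
# the same file as the weight it scales.
import json
import os
from safetensors import safe_open


def make_tensor_loader(model_dir: str):
    """Return a function that fetches any tensor by name, opening shards lazily."""
    with open(os.path.join(model_dir, "model.safetensors.index.json")) as f:
        weight_map = json.load(f)["weight_map"]  # tensor name -> shard file name
    open_shards = {}  # cache of already-opened shard handles

    def get_tensor(name: str):
        file_name = weight_map[name]
        if file_name not in open_shards:
            open_shards[file_name] = safe_open(
                os.path.join(model_dir, file_name), framework="pt", device="cpu"
            )
        return open_shards[file_name].get_tensor(name)

    return get_tensor


# Usage: fetch a weight and its inverse scale even if they live in different shards.
get_tensor = make_tensor_loader("/path/to/fp8/model")  # hypothetical path
weight = get_tensor("model.layers.0.mlp.down_proj.weight")
scale_inv = get_tensor("model.layers.0.mlp.down_proj.weight_scale_inv")
```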
Fixed in the main branch.
dzhulgakov changed discussion status to closed