How to run in FP16 · #11 · opened by Corny335
Hi,
is there a way to run it on a GPU with 24 GB, e.g. by running it in FP16?
I tried, but it breaks in image_tower_magma.py at line 375 (`x = self.clip_vision_model.trunk.stem(x)`).
Error message: `Input type (float) and bias type (c10::Half) should be the same`
Thanks,
Cornelius
I just modified the sample code in the README:
https://huggingface.co/microsoft/Magma-8B/blob/main/README.md#how-to-get-started-with-the-model
Can you try again?
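For anyone hitting the same traceback before pulling the updated sample code: the error means the model weights were cast to float16 while the input tensor stayed float32. A minimal sketch of the mismatch and the cast that resolves it (the `stem` layer here is an illustrative stand-in, not the actual Magma module):

```python
import torch

# Hypothetical stand-in for the vision stem: a conv layer whose weights
# were cast to half precision, fed a default float32 input.
stem = torch.nn.Conv2d(3, 8, kernel_size=3).half()
pixels = torch.randn(1, 3, 32, 32)  # float32 by default

# Calling stem(pixels) at this point would raise:
#   RuntimeError: Input type (float) and bias type (c10::Half) should be the same

# Fix: cast the input to the same dtype as the model parameters first.
pixels = pixels.to(next(stem.parameters()).dtype)  # now torch.float16
```

In the transformers sample this typically amounts to moving the processed inputs to the model's dtype (e.g. `inputs = inputs.to(model.dtype)`) before calling `generate`.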
Works, thanks.
Corny335 changed discussion status to closed.