strange message when using the model

#18
by lucas202 - opened

You are using a model of type minimax_text_01 to instantiate a model of type MiniMaxText01. This is not supported for all configurations of models and can yield errors.
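For reference, a minimal loading sketch that triggers this kind of warning (I'm assuming the standard `transformers` remote-code path and the `MiniMaxAI/MiniMax-Text-01` repo id here; my actual script may differ):

```python
# Minimal sketch, assuming the model is loaded through transformers with remote code.
# The warning is logged while the config is read inside from_pretrained, whenever the
# model_type string in config.json ("minimax_text_01") differs from the model_type
# declared by the class being instantiated ("MiniMaxText01").
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MiniMaxAI/MiniMax-Text-01"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # requires accelerate; spreads layers across available devices
)
```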

MiniMax org

Hello, we couldn't reproduce the issue. Could you provide more detailed information (such as the code location where the error occurs, error messages, etc.) to help us debug?

I appreciate your feedback, but could you tell me the hardware specs I need to host this LLM locally, for instance with llama.cpp?

If that is even possible: how much RAM? How much VRAM? What machine specs do you recommend? If I just blindly run the hosting script, it won't work.

Is it even possible to host a ~456B-parameter model like this one on a consumer computer at all?
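To make the question concrete, here is the rough weight-only math I'm working from (a sketch: the ~456B total parameter count is the figure on the model card, and the bytes-per-weight values for common GGUF quantizations are approximate):

```python
# Back-of-envelope memory math for the weights alone (KV cache and activations add more).
# Assumes ~456B total parameters and that a GGUF conversion at these quant levels exists.
TOTAL_PARAMS = 456e9

bytes_per_weight = {
    "fp16/bf16": 2.0,
    "int8 (Q8_0)": 1.0625,   # ~8.5 bits per weight
    "4-bit (Q4_0)": 0.5625,  # ~4.5 bits per weight
}

for name, bpw in bytes_per_weight.items():
    gib = TOTAL_PARAMS * bpw / 2**30
    print(f"{name:>14}: ~{gib:,.0f} GiB for weights alone")

# fp16/bf16:  ~850 GiB
# int8:       ~451 GiB
# 4-bit:      ~239 GiB
# Even a 4-bit quant needs a few hundred GiB of combined RAM/VRAM just for the weights,
# which is why I'm asking what hardware is realistically required.
```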
