Multiple GPUs

#3 by BigDeeper

I need a solution that can split a Wan model across 4 GPUs' VRAM (each is 12.2 GiB).

Any suggestions?

Something similar to what Ollama does with LLMs.
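
One option worth trying is the Diffusers port of Wan together with Accelerate's multi-GPU placement. A minimal sketch is below; the checkpoint ID, dtype, and generation settings are placeholders, so substitute whatever Wan checkpoint you actually have:

```python
# Sketch: spreading a Wan pipeline across several GPUs with
# diffusers + accelerate. Model ID and settings are assumptions.
import torch
from diffusers import WanPipeline
from diffusers.utils import export_to_video

pipe = WanPipeline.from_pretrained(
    "Wan-AI/Wan2.1-T2V-14B-Diffusers",  # assumed Diffusers-format checkpoint
    torch_dtype=torch.bfloat16,
    device_map="balanced",  # place components across all visible GPUs
)

# Components now live on different cards; accelerate's hooks move
# activations between devices automatically during inference.
frames = pipe(
    prompt="a cat surfing a wave at sunset",
    num_frames=33,
    num_inference_steps=30,
).frames[0]
export_to_video(frames, "output.mp4", fps=16)
```

Note that `device_map="balanced"` splits at the component level (text encoder, transformer, VAE on different GPUs), which is coarser than Ollama's per-layer split. If the transformer alone doesn't fit on one 12 GiB card, layer-level sharding via Accelerate's `load_checkpoint_and_dispatch`, or quantization plus CPU offload, would be the closer analogue.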
