RTX 3090 Ti, 24 GB VRAM: CUDA out of memory error

#6
by tjohn8888

On my RTX 3090 Ti with 24 GB of VRAM, I get a CUDA out of memory error when running app.py. How much video memory does this application require?
When I run TripoSG separately (without the Gradio interface), the 3D mesh builds fine; the problem is the texture mapping step.
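In case it helps narrow things down, this is how I watch VRAM around the texture step. It uses only standard PyTorch memory queries; `run_texture_mapping` is just a placeholder for whatever texturing call app.py makes, not a real function from this repo:

```python
import torch

def report_vram(stage: str) -> None:
    """Print current and peak VRAM usage for the default CUDA device."""
    allocated = torch.cuda.memory_allocated() / 1024**3
    peak = torch.cuda.max_memory_allocated() / 1024**3
    total = torch.cuda.get_device_properties(0).total_memory / 1024**3
    print(f"[{stage}] allocated: {allocated:.2f} GiB, "
          f"peak: {peak:.2f} GiB, total: {total:.2f} GiB")

# Reset the peak counter so the peak reflects only the texture step.
torch.cuda.reset_peak_memory_stats()
report_vram("before texture mapping")

# run_texture_mapping(mesh, image)  # placeholder for the texturing call in app.py

report_vram("after texture mapping")
torch.cuda.empty_cache()
```

With this, the peak reading right before the OOM shows how far past 24 GB the texture step tries to go.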
