How to host via Inference Endpoints?
#7
by blevlabs - opened
On the Gradio demo for this IDEFICS model, the API unfortunately does not work. How could I host this model on the Inference Endpoints service?
I attempted to host it by simply selecting the model directly when creating an endpoint, but it failed to start.
Any insights would be appreciated.
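In case it helps clarify what I am trying to do, here is a rough sketch of what I imagine a custom `handler.py` for the endpoint might look like. The request payload shape and generation settings are guesses on my part, and I am not sure whether a custom handler is even the right approach here:

```python
# handler.py -- rough sketch of a custom Inference Endpoints handler for IDEFICS.
# NOTE: the payload format ({"inputs": [...]}) and generation settings below are
# my own assumptions, not an official format.
from typing import Any, Dict, List

import torch
from transformers import AutoProcessor, IdeficsForVisionText2Text


class EndpointHandler:
    def __init__(self, path: str = ""):
        # `path` is the local copy of the model repository on the endpoint
        self.processor = AutoProcessor.from_pretrained(path)
        self.model = IdeficsForVisionText2Text.from_pretrained(
            path, torch_dtype=torch.bfloat16, device_map="auto"
        )

    def __call__(self, data: Dict[str, Any]) -> List[Dict[str, str]]:
        # Expect a prompt list that interleaves text and image URLs, e.g.
        # {"inputs": ["User: what is in this image?",
        #             "https://example.com/img.jpg",
        #             "<end_of_utterance>", "\nAssistant:"]}
        prompt = data["inputs"]
        inputs = self.processor([prompt], return_tensors="pt").to(self.model.device)
        generated_ids = self.model.generate(**inputs, max_new_tokens=128)
        text = self.processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
        return [{"generated_text": text}]
```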