Is vLLM serving supported?
#2 · by comefeel · opened
I have confirmed that the model is working properly.
However, when I attempted to run it with vLLM to check the throughput, the attempt failed.
Is vLLM serving supported?
If so, please share the optimal configuration settings.
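For context, here is a minimal sketch of the kind of vLLM serving attempt described above. The repo id, context length, and sampling settings are assumptions rather than values taken from this thread, and per the reply below this currently fails because the model is not yet supported in vLLM:

```python
# Minimal sketch of an offline vLLM attempt; repo id and settings are assumptions.
from vllm import LLM, SamplingParams

llm = LLM(
    model="skt/A.X-4.0-VL-Light",  # assumed Hugging Face repo id
    trust_remote_code=True,        # custom VL architectures usually require this
    max_model_len=8192,            # example value; tune to the model's context length
)

params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["Describe this model in one sentence."], params)
print(outputs[0].outputs[0].text)
```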
I need this too.
We thank you for your interest in A.X 4.0 VL Light.
At this time, vLLM serving is not supported. However, we are considering adding support in the future.
If you have any further questions, please don’t hesitate to let us know.