Model weights compiled with MLC-LLM from the source project: https://huggingface.co/GeneZC/MiniChat-1.5-3B
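
A minimal usage sketch with the `mlc_llm` Python package, following its OpenAI-style chat API. The local model path below is a placeholder for wherever the compiled weights from this repo are downloaded, and the exact API may vary between mlc-llm versions.

```python
from mlc_llm import MLCEngine

# Placeholder: point this at the locally downloaded compiled weights from this repo.
model = "./MiniChat-1.5-3B-MLC"

engine = MLCEngine(model)

# Stream a single chat completion and print tokens as they arrive.
for response in engine.chat.completions.create(
    messages=[{"role": "user", "content": "Hello!"}],
    model=model,
    stream=True,
):
    for choice in response.choices:
        print(choice.delta.content, end="", flush=True)
print()

engine.terminate()
```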
