Jul 18, 2025

Inference Providers now fully support OpenAI-compatible API

Our Inference Providers service now fully supports the OpenAI-compatible API, making it easy to integrate with existing workflows. Plus, you can now specify the provider name directly in the model path for greater flexibility.

This means you can switch between inference providers without changing tools: keep the familiar OpenAI client, point it at router.huggingface.co/v1, and include the provider name in the model path.

This update simplifies integration while maintaining full compatibility with OpenAI's SDK. Try it now with supported providers like novita, groq, together, and more!

Using Kimi K2 with the Groq Inference Provider via the OpenAI client
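Here is a minimal sketch with the official `openai` Python SDK. It assumes a Hugging Face access token in the `HF_TOKEN` environment variable and uses `moonshotai/Kimi-K2-Instruct` as the model ID, with the provider name appended after a colon as described above; adjust both to your own setup.

```python
import os

from openai import OpenAI

# Point the standard OpenAI client at the Hugging Face
# Inference Providers router instead of api.openai.com.
client = OpenAI(
    base_url="https://router.huggingface.co/v1",
    api_key=os.environ["HF_TOKEN"],  # your Hugging Face access token
)

# The provider is selected by appending its name to the model path.
completion = client.chat.completions.create(
    model="moonshotai/Kimi-K2-Instruct:groq",
    messages=[
        {"role": "user", "content": "What is the capital of France?"},
    ],
)

print(completion.choices[0].message.content)
```

Switching providers is then just a one-line change: swap the `:groq` suffix for another supported provider such as `:novita` or `:together`, and the rest of the code stays the same.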