DeepInfra on Hugging Face Inference Providers 🔥


We’re thrilled to share that DeepInfra is now a supported Inference Provider on the Hugging Face Hub!

DeepInfra joins our growing ecosystem, enhancing the breadth and capabilities of serverless inference directly on the Hub’s model pages. Inference Providers are also seamlessly integrated into our client SDKs (for both JS and Python), making it super easy to use a wide variety of models with your preferred providers.

DeepInfra is a serverless AI inference platform offering some of the most cost-effective per-token pricing in the industry. With a catalog of over 100 models, DeepInfra makes it easy for developers to integrate a wide range of AI capabilities into their applications with minimal setup.

DeepInfra supports a broad spectrum of model types – from LLMs to text-to-image, text-to-video, embeddings, and more. As part of this initial integration, DeepInfra is launching support for conversational and text-generation tasks on Hugging Face, enabling access to popular open-weight LLMs such as DeepSeek V4, Kimi-K2.6, GLM-5.1, and many more. Support for additional tasks (text-to-image, text-to-video, embeddings, and more) will roll out soon!

Read more about how to use DeepInfra as an Inference Provider in its dedicated documentation page.