Groq on Hugging Face Inference Providers 🔥


We’re thrilled to share that Groq is now a supported Inference Provider on the Hugging Face Hub!
Groq joins our growing ecosystem, enhancing the breadth and capabilities of serverless inference directly on the Hub’s model pages. Inference Providers are also seamlessly integrated into our client SDKs (for both JS and Python), making it super easy to use a wide variety of models with your preferred providers.

Groq supports a wide variety of text and conversational models, including the latest open-source models such as Meta’s Llama 4 and Qwen’s QwQ-32B.
