Welcome Llama 4 Maverick & Scout on Hugging Face
We are incredibly excited to welcome the next generation of large language models from Meta to the Hugging Face Hub: Llama 4 Maverick (~400B total parameters) and Llama 4 Scout (~109B total parameters)! 🤗 Both are Mixture of Experts (MoE) models that activate only 17B parameters per token.
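To build intuition for the gap between total and active parameters in an MoE model, here is a minimal toy sketch of the accounting. All sizes are illustrative, not Llama 4's real dimensions: each token is routed to only `top_k` experts, so most expert parameters sit idle for any given token.

```python
# Toy Mixture of Experts (MoE) parameter accounting. The numbers are
# illustrative only -- they are NOT Llama 4's real layer sizes.

def moe_params(shared: int, expert_size: int, num_experts: int, top_k: int) -> tuple[int, int]:
    """Return (total, active-per-token) parameter counts for a toy MoE layer."""
    total = shared + num_experts * expert_size
    # Only the top_k routed experts run for a given token.
    active = shared + top_k * expert_size
    return total, active

# A toy layer with 16 experts and top-1 routing: the total count grows with
# every expert, but per-token compute only pays for one of them. This is the
# same principle that lets a ~109B-parameter model run with 17B active.
total, active = moe_params(shared=2, expert_size=4, num_experts=16, top_k=1)
print(total, active)  # 66 total parameters, 6 active per token
```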
Released today, these powerful, natively multimodal models represent a significant leap forward. We've worked closely with Meta to ensure seamless integration into the Hugging Face ecosystem, with day-one support in both transformers and Text Generation Inference (TGI).
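As a quick sketch of what the transformers integration enables, the snippet below loads Scout through the high-level `pipeline` API. Treat the model id and task name as assumptions to verify on the model card; you will also need a recent transformers release, access to the gated repo, and enough GPU memory for the sharded weights.

```python
# Sketch of loading Llama 4 Scout with transformers' pipeline API.
# The model id below is an assumption -- check the actual repo name on the
# Hugging Face Hub, and make sure you have accepted the model license.
from transformers import pipeline

def load_llama4(model_id: str = "meta-llama/Llama-4-Scout-17B-16E-Instruct"):
    # device_map="auto" shards the MoE weights across the available GPUs.
    return pipeline("text-generation", model=model_id, device_map="auto")
```

With the pipeline in hand, generation is a single call, e.g. `load_llama4()("Tell me about Llama 4.")`.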
This is just the start of our journey with Llama 4. Over the coming days we’ll continue to collaborate with the