Falcon 2: An 11B-parameter pretrained language model and VLM, trained on over 5,000B tokens in 11 languages
The Falcon 2 Models

TII is launching a new generation of models, Falcon 2, focused on providing the open-source community with a series of smaller models that offer enhanced performance and multi-modal support. Our goal is to enable cheaper inference and to encourage the development of more downstream applications with improved usability.

The first generation of Falcon models, featuring Falcon-40B and Falcon-180B, made a significant contribution to the open-source community.