The Transformers Library: standardizing model definitions
TL;DR: Going forward, we're aiming for Transformers to be the pivot across frameworks: if a model architecture is
supported by transformers, you can expect it to be supported in the rest of the ecosystem.
Transformers was created in 2019, shortly after the release of the BERT Transformer model. Since then, we've
continuously aimed to add state-of-the-art architectures, initially focusing on NLP, then expanding to audio and
computer vision. Today, transformers is the default library for LLMs and VLMs in the Python ecosystem.
Transformers now supports 300+ model architectures, with an average of ~3 new architectures added every week.
We