Faster TensorFlow models in Hugging Face Transformers

Julien Plu

In the last few months, the Hugging Face team has been working hard on improving Transformers’ TensorFlow models to make them faster and more robust. The recent improvements focus mainly on two aspects:

  1. Computational performance: BERT, RoBERTa, ELECTRA and MPNet have been improved to run with much faster computation times (see the minimal usage sketch below).
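
As a rough illustration of how these TensorFlow models are used from Transformers, here is a minimal sketch that loads one of the listed architectures and calls it both eagerly and through `tf.function`, since graph mode is where the computational speed-ups matter most. The `bert-base-cased` checkpoint and the example sentences are illustrative placeholders, not choices made in the original post.

```python
# Minimal sketch: running a Transformers TensorFlow model eagerly and in graph mode.
# The checkpoint name and the example sentences are illustrative placeholders.
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = TFAutoModel.from_pretrained("bert-base-cased")

# Tokenize a small batch of sentences into TensorFlow tensors.
batch = dict(
    tokenizer(
        ["TensorFlow models are getting faster.", "Hugging Face Transformers."],
        padding=True,
        return_tensors="tf",
    )
)

# Eager mode: the model is called like a plain Python function.
eager_hidden = model(batch).last_hidden_state

# Graph mode: tf.function traces the call into a TensorFlow graph,
# which is where most of the performance gains become visible.
@tf.function
def forward(inputs):
    # Return only the final hidden states to keep the traced output simple.
    return model(inputs).last_hidden_state

graph_hidden = forward(batch)
print(eager_hidden.shape)
print(graph_hidden.shape)
```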
