Introduction to PyTorch-Transformers: An Incredible Library for State-of-the-Art NLP (with Python code)

Overview

  • In this article, we look at PyTorch-Transformers, the latest state-of-the-art NLP library
  • We will also use PyTorch-Transformers in Python to work with popular NLP models like Google’s BERT and OpenAI’s GPT-2!
  • This has the potential to revolutionize the landscape of NLP as we know it


Introduction

“NLP’s ImageNet moment has arrived.” – Sebastian Ruder

Imagine having the power to build the Natural Language Processing (NLP) model that powers Google Translate. What if I told you this can be done using just a few lines of code in Python? Sounds like an incredibly exciting opportunity.

Well, we can now do this sitting in front of our own machines! The latest state-of-the-art NLP release, called PyTorch-Transformers, comes from the folks at HuggingFace. The library was released just yesterday, and I’m thrilled to present my first impressions along with the Python code.

The ability to harness this research would have taken a combination of years, some of the best minds, and extensive resources to build from scratch. PyTorch-Transformers puts that power within reach in just a few lines of Python code.
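To give a flavour of what those few lines look like, here is a minimal sketch of loading a pretrained BERT model with pytorch-transformers and extracting contextual embeddings for a sentence. It assumes the library has been installed with `pip install pytorch-transformers` and uses the 'bert-base-uncased' checkpoint as an example; we will walk through fuller examples with BERT and GPT-2 later in the article.

```python
# Minimal sketch: load a pretrained BERT model and get contextual embeddings.
# Assumes: pip install pytorch-transformers, and the 'bert-base-uncased' checkpoint.
import torch
from pytorch_transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')
model.eval()  # inference mode

# Tokenize a sentence and convert it to a batch of input IDs
input_ids = torch.tensor([tokenizer.encode("NLP's ImageNet moment has arrived.")])

with torch.no_grad():
    # The model returns a tuple; the first element is the last hidden layer
    last_hidden_states = model(input_ids)[0]

print(last_hidden_states.shape)  # (1, sequence_length, 768) for bert-base
```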
