Implementing Transformers in NLP in Under 5 Lines of Code

Today, we will see a gentle introduction to the transformers library for running state-of-the-art models on complex NLP tasks. Applying state-of-the-art Natural Language Processing models has never been more straightforward. Hugging Face has released a compelling library called transformers that allows us to use a broad class of state-of-the-art NLP models with only a few lines of code. In this article, we will install and use the transformers library […]
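The title promises state-of-the-art NLP in under five lines; a minimal sketch of what that typically looks like with the transformers pipeline API is shown below. The sentiment-analysis task and the example sentence are illustrative choices, not necessarily the ones used in the article.

```python
# Sentiment analysis with a default pretrained model, in a handful of lines.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default checkpoint on first use
print(classifier("Transformers make state-of-the-art NLP remarkably simple."))
# -> [{'label': 'POSITIVE', 'score': ...}]
```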

Read more

Introduction to Hugging Face’s Transformers v4.3.0 and its First Automatic Speech Recognition Model – Wav2Vec2

Overview: Hugging Face has released Transformers v4.3.0, which introduces the first Automatic Speech Recognition model to the library: Wav2Vec2. Using one hour of labeled data, Wav2Vec2 outperforms the previous state of the art on the 100-hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data, Wav2Vec2 achieves 4.8/8.2 WER. We will also understand the Wav2Vec2 implementation using the transformers library for audio-to-text generation. Introduction: Transformers has been […]
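As a rough illustration of the audio-to-text workflow mentioned above, here is a minimal sketch using the Wav2Vec2 classes in transformers. The checkpoint name, the soundfile dependency, and the audio path "sample.wav" are assumptions for the example, not details taken from the article.

```python
# Minimal speech-to-text sketch with Wav2Vec2 (assumed checkpoint and audio file).
import torch
import soundfile as sf
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-960h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-960h")

speech, sample_rate = sf.read("sample.wav")  # the model expects 16 kHz mono audio
inputs = processor(speech, sampling_rate=sample_rate, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(inputs.input_values).logits   # per-frame character logits

predicted_ids = torch.argmax(logits, dim=-1)     # greedy CTC decoding
transcription = processor.batch_decode(predicted_ids)
print(transcription)
```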

Read more

Implementation of Attention Mechanism for Caption Generation on Transformers using TensorFlow

Overview: Learning about the state-of-the-art Transformer model. Understand how we can implement Transformers for the image captioning problem seen earlier, using TensorFlow. Comparing the results of Transformers vs. attention models. Introduction: We have seen (in the previous article) that attention mechanisms have become an integral part of compelling sequence modeling and transduction models for various tasks (such as image captioning), allowing modeling of dependencies without regard to their distance in the input or output […]
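Since the excerpt centers on attention as the building block of Transformers, a minimal TensorFlow sketch of scaled dot-product attention is given below. The function name and shapes are illustrative and not taken from the article's captioning code.

```python
# Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (..., seq_len, depth)
    matmul_qk = tf.matmul(q, k, transpose_b=True)       # (..., seq_len_q, seq_len_k)
    dk = tf.cast(tf.shape(k)[-1], tf.float32)
    scaled_logits = matmul_qk / tf.math.sqrt(dk)         # scale by sqrt(key depth)
    if mask is not None:
        scaled_logits += (mask * -1e9)                   # suppress padded / future positions
    weights = tf.nn.softmax(scaled_logits, axis=-1)      # attention distribution
    return tf.matmul(weights, v), weights                # weighted sum of values
```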

Read more

Emotion Classification on Twitter Data Using Transformers

Introduction: The world of Natural Language Processing has recently been overtaken by the invention of Transformers. Transformers differ entirely from conventional sequence-based networks. RNNs were the initial tool of choice for sequence-based tasks like text generation, text classification, etc., and with the arrival of LSTM and GRU cells, the issue of capturing long-term dependencies in text was largely resolved. However, training a model with LSTM cells is slow because it cannot be parallelized across time steps.
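For context on what the emotion-classification task looks like in practice, here is a minimal sketch with the transformers pipeline API. The checkpoint name is an assumption (a publicly shared emotion-tuned model on the Hugging Face Hub, to be verified before use) and is not necessarily the one used in the article.

```python
# Emotion classification on short tweets with a text-classification pipeline.
from transformers import pipeline

# Assumed emotion-tuned checkpoint; substitute any emotion model from the Hub.
classifier = pipeline("text-classification",
                      model="bhadresh-savani/distilbert-base-uncased-emotion")
tweets = ["I can't believe I won the ticket!", "Stuck in traffic again..."]
print(classifier(tweets))   # e.g. [{'label': 'joy', ...}, {'label': 'anger', ...}]
```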

Read more

Out-of-the-box NLP functionalities for your project using the Transformers Library!

In this tutorial, you will learn how you can integrate common Natural Language Processing (NLP) functionalities into your application with minimal effort. We will be doing this using the ‘transformers’ library provided by Hugging Face. 1. First, install the transformers library with !pip install transformers. 2. Next, import the necessary functions with from transformers import pipeline. 3. Irrespective of the task that […]
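Step 3 is cut off above; under the assumption that the tutorial continues by creating a task-specific pipeline, a sketch for one such task (question answering) looks like this. The question and context strings are illustrative, not taken from the article.

```python
# Question answering out of the box: the pipeline picks a default QA model.
from transformers import pipeline

qa = pipeline("question-answering")
result = qa(question="Which library provides the pipeline API?",
            context="The transformers library from Hugging Face provides a simple pipeline API.")
print(result["answer"])
```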

Read more

A Comprehensive Guide to Building Your Own Language Model in Python!

Overview: Language models are a crucial component of the Natural Language Processing (NLP) journey. These language models power all the popular NLP applications we are familiar with: Google Assistant, Siri, Amazon’s Alexa, etc. We will go from basic language models to advanced ones in Python here. Introduction: We tend to look through language and not realize how much power it has. Language is such a powerful medium of communication. We have the ability to build projects from scratch […]
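As a concrete picture of the "basic language model" end of that spectrum, here is a toy bigram model in plain Python that estimates next-word probabilities from counts; the tiny corpus is made up purely for illustration.

```python
# A toy bigram language model: estimate P(next word | current word) by counting.
from collections import defaultdict, Counter

corpus = [
    "i love natural language processing",
    "i love building language models",
    "language models power many nlp applications",
]

# Count how often each word follows each other word.
bigram_counts = defaultdict(Counter)
for sentence in corpus:
    tokens = sentence.split()
    for current_word, next_word in zip(tokens, tokens[1:]):
        bigram_counts[current_word][next_word] += 1

def next_word_probabilities(word):
    """Return P(next | word) estimated by relative frequency."""
    counts = bigram_counts[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probabilities("language"))  # {'processing': 0.33..., 'models': 0.66...}
```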

Read more

Transfer Learning for NLP: Fine-Tuning BERT for Text Classification

Introduction: With the advancement of deep learning, neural network architectures like recurrent neural networks (RNN and LSTM) and convolutional neural networks (CNN) have shown decent performance improvements on several Natural Language Processing (NLP) tasks like text classification, language modeling, machine translation, etc. However, this performance of deep learning models in NLP pales in comparison to the performance of deep learning in Computer Vision. One of the main reasons for this slow progress could be the lack of […]
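To make the fine-tuning idea from the title concrete, here is a minimal sketch of loading a pretrained BERT checkpoint for sequence classification and running one training-style forward and backward pass with the transformers library. The checkpoint name, label count, and toy batch are assumptions, not the article's exact setup.

```python
# Fine-tuning sketch: BERT with a fresh classification head, one toy step.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Tokenize a toy batch and run one forward pass with labels to get a loss.
batch = tokenizer(["great movie", "terrible plot"], padding=True, return_tensors="pt")
labels = torch.tensor([1, 0])
outputs = model(**batch, labels=labels)   # returns loss and logits
outputs.loss.backward()                   # gradients flow into all BERT layers (fine-tuning)
print(outputs.loss.item(), outputs.logits.shape)
```

In a full training loop, an optimizer step and a real labeled dataset would follow, but the core mechanism is exactly this loss computed through the pretrained encoder.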

Read more