Out-of-the-box NLP functionalities for your project using the Transformers Library!

This article was published as a part of the Data Science Blogathon. Introduction In this tutorial, you will learn how to integrate common Natural Language Processing (NLP) functionalities into your application with minimal effort. We will do this using the 'transformers' library provided by Hugging Face. 1. First, install the transformers library (!pip install transformers). 2. Next, import the necessary functions (from transformers import pipeline). 3. Irrespective of the task that […]
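As a quick illustration of where those steps lead, here is a minimal sketch of the pipeline API; the sentiment-analysis task and the sample sentence are illustrative assumptions, not taken from the article:

# Minimal sketch (assumed task): a default sentiment-analysis pipeline
from transformers import pipeline

# The first call downloads a default pretrained model for the chosen task
classifier = pipeline("sentiment-analysis")

# Run inference on a sample sentence and print the predicted label and score
print(classifier("Integrating NLP into an application can take just a few lines."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]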

Read more

Analysis of Brazilian E-commerce Text Review Dataset Using NLP and Google Translate

This article was published as a part of the Data Science Blogathon. Introduction Comprehending customer reviews is crucial for a business to be successful. Analyzing the reviews helps to properly discern customers' different preferences, likes, dislikes, etc. These extracted insights can then be used to improve customer service and experience. In this article, we will work on a Brazilian E-commerce reviews dataset, performing some exploratory data analysis (EDA) on the review text and deriving […]

Read more

Introduction to Automatic Speech Recognition and Natural Language Processing

This article was published as a part of the Data Science Blogathon. Introduction In this article, we will take a closer look at how speech recognition really works. Now, when we say speech recognition, we’re really talking about ASR, or automatic speech recognition. With automatic speech recognition, the goal is to simply input any continuous audio speech and output the text equivalent. We want our ASR to be speaker-independent and have high accuracy. Such a system has long been a […]

Read more

GPT-3: The Next Big Thing! Foundation of the Future?

This article was published as a part of the Data Science Blogathon. Introduction Have you ever wished you could write just two lines of an essay or a journal entry and have the computer write the rest for you? If yes, then GPT-3 is the answer for you. Baffled? So are the people who got their hands on GPT-3. Every field in AI is making advancements, and NLP and Deep Learning are such […]

Read more

Fake news classifier on US Election News📰 | LSTM 🈚

Introduction News media has become a channel for passing on information about what is happening in the world to people. Often, people perceive whatever is conveyed in the news to be true. There have been circumstances where even news channels acknowledged that their reporting was not true as written. But some news has a significant impact not only on the people or […]

Read more

Hands-On Tutorial on Stack Overflow Question Tagging

This article was published as a part of the Data Science Blogathon. Background I won’t be lying if I assert that every developer/engineer/student has used the website Stack Overflow more than once in their journey. Widely considered one of the largest and most trusted websites for developers to learn and share their knowledge, the site presently hosts in excess of 10,000,000 questions. In this post, we try to predict the question tags based on the question text asked on […]

Read more

Top 5 Machine Learning GitHub Repositories & Reddit Discussions (October 2018)

Introduction “Should I use GitHub for my projects?” – I’m often asked this question by aspiring data scientists. There’s only one answer to this – “Absolutely!”. GitHub is an invaluable platform for data scientists looking to stand out from the crowd. It’s an online resume for displaying your code to recruiters and fellow professionals. The fact that GitHub hosts open-source projects from top tech behemoths like Google, Facebook, IBM, NVIDIA, etc. is what adds to the gloss of […]

Read more

How do Transformers Work in NLP? A Guide to the Latest State-of-the-Art Models

Overview The Transformer model in NLP has truly changed the way we work with text data. The Transformer is behind the recent NLP developments, including Google’s BERT. Learn how the Transformer idea works, how it’s related to language modeling and sequence-to-sequence modeling, and how it enables Google’s BERT model. Introduction I love being a data scientist working in Natural Language Processing (NLP) right now. The breakthroughs and developments are occurring at an unprecedented pace. From the super-efficient ULMFiT framework to Google’s […]

Read more

Summarize Twitter Live data using Pretrained NLP models

Introduction Twitter users spend an average of 4 minutes on the platform, and about 1 of those minutes is spent reading the same stuff, meaning roughly 25% of their time goes to re-reading the same content. Also, most tweets will never appear on your dashboard. You may get to know the trending topics, but you miss the non-trending ones. Even within trending topics, you might only read the top 5 tweets and their comments. So, what are […]

Read more

Tired of Reading Long Articles? Text Summarization will make your task easier!

This article was published as a part of the Data Science Blogathon. Introduction Millions of web pages and websites exist on the Internet today, and going through such a vast amount of content to extract information on a certain topic becomes very difficult. Google will filter the results and give you the top ten, but often you are still unable to find the exact content you need. There is a lot of redundant and overlapping data in the articles […]

Read more