Evolution Strategies From Scratch in Python

Evolution Strategies is a stochastic global optimization algorithm. It is an evolutionary algorithm related to others, such as the genetic algorithm, although it is designed specifically for continuous function optimization. In this tutorial, you will discover how to implement the evolution strategies optimization algorithm. After completing this tutorial, you will know:
- Evolution Strategies is a stochastic global optimization algorithm inspired by the biological theory of evolution by natural selection.
- There is a standard terminology for Evolution Strategies and two common […]
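
The tutorial's own code is behind the link, but a minimal sketch of the kind of (mu, lambda) evolution strategy it describes might look like the following; the objective function, population sizes, step size, and generation count are illustrative assumptions, not the tutorial's values.

    # Minimal (mu, lambda) evolution strategy for continuous minimization.
    # Objective, population sizes, and step size are illustrative choices.
    import numpy as np

    def sphere(x):
        # Simple continuous test objective: global minimum at the origin.
        return np.sum(x ** 2)

    def evolution_strategy(objective, dim=5, mu=10, lam=50, sigma=0.3,
                           generations=100, seed=1):
        rng = np.random.default_rng(seed)
        parents = rng.normal(size=(mu, dim))
        for _ in range(generations):
            # Each child mutates a randomly chosen parent with Gaussian noise.
            idx = rng.integers(0, mu, size=lam)
            children = parents[idx] + sigma * rng.normal(size=(lam, dim))
            # Truncation (comma) selection: keep the mu best children only.
            fitness = np.array([objective(c) for c in children])
            parents = children[np.argsort(fitness)[:mu]]
        best = min(parents, key=objective)
        return best, objective(best)

    best, score = evolution_strategy(sphere)
    print(score)

Comma selection, where parents are discarded every generation, is one of the two common variants the excerpt alludes to; the other, (mu + lambda), also keeps the parents in the selection pool.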

Read more

Basics of Natural Language Processing (NLP) for Absolute Beginners

According to industry estimates, only 21% of the available data is present in a structured form. Data is being generated as we speak: as we tweet, as we send messages on WhatsApp, and in various other activities. The majority of this data exists in textual form, which is highly unstructured in nature. Although this data is high-dimensional, the information present in it is not directly accessible unless it is processed (read and understood) manually or analyzed by an […]
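
To make the "processed" step concrete, here is a small sketch of the most basic kind of text processing, tokenization and counting, using only Python's standard library; the sample sentence is adapted from the excerpt and the regular expression is an illustrative choice, not anything prescribed by the post.

    # Turning unstructured text into something countable: a minimal
    # tokenize-and-count step using only the standard library.
    import re
    from collections import Counter

    text = "Data is being generated as we speak, as we tweet, as we send messages."
    tokens = re.findall(r"[a-z']+", text.lower())   # crude word tokenizer
    counts = Counter(tokens)
    print(counts.most_common(3))   # e.g. [('as', 3), ('we', 3), ('data', 1)]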

Read more

HackerNoon Interview

This post is an interview of me by fast.ai fellow Sanyam Bhutani. It originally appeared at HackerNoon with a different introduction. I had the honour to be interviewed by Sanyam Bhutani, a Deep Learning and Computer Vision practitioner and fast.ai fellow who has been running an interview series with people who inspire him. To be honest, it feels surreal to be the one being interviewed. I hope my answers may be interesting or useful to some of you. Sanyam: Hello Sebastian, […]

Read more

EMNLP 2018 Highlights: Inductive bias, cross-lingual learning, and more

The post discusses highlights of the 2018 Conference on Empirical Methods in Natural Language Processing (EMNLP 2018). This post originally appeared at the AYLIEN blog. You can find past highlights of conferences here. You can find all 549 accepted papers in the EMNLP proceedings. In this review, I will focus on papers that relate to the following topics. Inductive bias: the inductive bias of a machine learning algorithm is the set of assumptions that the model makes in order to […]
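
As a rough illustration of the concept (not an example taken from the papers the post reviews): a convolutional layer encodes the assumption that useful patterns can occur anywhere in the input by sharing one small kernel across positions, whereas a fully connected layer makes no such assumption and pays for it in parameters.

    # Inductive bias as parameter sharing: a 1-D convolution reuses one
    # small kernel at every position (assuming translation invariance),
    # while a dense layer learns a separate weight for every input-output
    # pair. Sizes are arbitrary, for illustration only.
    import numpy as np

    input_len, kernel_len = 100, 3
    output_len = input_len - kernel_len + 1

    dense_params = input_len * output_len   # 100 * 98 = 9800 weights
    conv_params = kernel_len                # 3 shared weights
    print(dense_params, conv_params)

    # The convolution applies the same kernel at every position:
    x = np.random.randn(input_len)
    kernel = np.random.randn(kernel_len)
    out = np.array([x[i:i + kernel_len] @ kernel for i in range(output_len)])
    print(out.shape)                        # (98,)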

Read more

10 Exciting Ideas of 2018 in NLP

This post gathers 10 ideas that I found exciting and impactful this year, and that we'll likely see more of in the future. For each idea, I will highlight 1-2 papers that execute it well. I tried to keep the list succinct, so apologies if I did not cover all relevant work. The list is necessarily subjective and covers ideas mainly related to transfer learning and generalization. Most of these (with some exceptions) are not trends (but I suspect that some […]

Read more

The 4 Biggest Open Problems in NLP

This post discusses 4 major open problems in NLP based on an expert survey and a panel discussion at the Deep Learning Indaba. This is the second blog post in a two-part series. The series expands on the Frontiers of Natural Language Processing session organized by Herman Kamper, Stephan Gouws, and me at the Deep Learning Indaba 2018. Slides of the entire session can be found here. The first post discussed major recent advances in NLP focusing on neural network-based […]

Read more

Neural Transfer Learning for Natural Language Processing (PhD thesis)

I finally got around to submitting my thesis. The thesis touches on the four areas of transfer learning that are most prominent in current Natural Language Processing (NLP): domain adaptation, multi-task learning, cross-lingual learning, and sequential transfer learning. Most of the work in the thesis has been previously presented (see Publications). Nevertheless, there are some new parts as well. The most notable are: a background chapter (§2) that lays out key concepts in terms of probability and information theory, machine […]

Read more

The State of Transfer Learning in NLP

Update 16.10.2020: Added Chinese and Spanish translations. This post expands on the NAACL 2019 tutorial on Transfer Learning in NLP. The tutorial was organized by Matthew Peters, Swabha Swayamdipta, Thomas Wolf, and me. In this post, I highlight key insights and takeaways and provide updates based on recent work. You can see the structure of this post below. The slides, a Colaboratory notebook, and code of the tutorial are available online. For an overview of what transfer learning is, have […]
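
The tutorial's notebook is behind the link above; as a rough sketch of the sequential transfer learning recipe it covers (take a pretrained model, add a task head, fine-tune), here is a minimal example using the Hugging Face transformers library. The checkpoint name, label count, learning rate, and toy data are placeholder assumptions.

    # Sequential transfer learning in a nutshell: load a pretrained encoder
    # with a classification head and fine-tune it on labelled examples.
    # Checkpoint, labels, and training details are placeholder choices.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2)

    batch = tokenizer(["a great movie", "a dull movie"],
                      padding=True, return_tensors="pt")
    labels = torch.tensor([1, 0])

    # One fine-tuning step; in practice you would loop over a real dataset.
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
    loss = model(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    print(float(loss))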

Read more

Unsupervised Cross-lingual Representation Learning

This post expands on the ACL 2019 tutorial on Unsupervised Cross-lingual Representation Learning. The tutorial was organised by Ivan Vulić, Anders Søgaard, and me. In this post, I highlight key insights and takeaways and provide additional context and updates based on recent work. In particular, I cover unsupervised deep multilingual models such as multilingual BERT. You can see the structure of this post below. The slides of the tutorial are available online. Cross-lingual representation learning can be seen as an […]
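
As a small illustration of the kind of model the post covers: embedding a sentence and its translation with multilingual BERT and comparing the vectors. The mean pooling and cosine similarity below are common illustrative choices, not something prescribed by the tutorial, and the sentence pair is invented.

    # Embed a sentence and its translation with multilingual BERT and
    # compare them; pooling and similarity are illustrative choices.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    model = AutoModel.from_pretrained("bert-base-multilingual-cased")

    def embed(sentence):
        inputs = tokenizer(sentence, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**inputs).last_hidden_state   # (1, seq_len, dim)
        return hidden.mean(dim=1).squeeze(0)             # mean-pool over tokens

    en = embed("The cat sits on the mat.")
    de = embed("Die Katze sitzt auf der Matte.")
    print(torch.cosine_similarity(en, de, dim=0).item())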

Read more

10 ML & NLP Research Highlights of 2019

This post gathers ten ML and NLP research directions that I found exciting and impactful in 2019. For each highlight, I summarise the main advances that took place this year, briefly state why I think it is important, and provide a short outlook to the future. The full list of highlights is here:
- Universal unsupervised pretraining
- Lottery tickets
- The Neural Tangent Kernel
- Unsupervised multilingual learning
- More robust benchmarks
- ML and NLP for science
- Fixing decoding errors in NLG
- Augmenting pretrained […]

Read more