Moving NLP Forward With Transformer Models and Attention


What’s the big breakthrough in Natural Language Processing (NLP) that has dramatically advanced machine learning into deep learning? What makes transformer models unique, and what defines “attention”? This week on the show, Jodie Burchell, developer advocate for data science at JetBrains, continues our talk about how machine learning (ML) models understand and generate text.

This episode is a continuation of the conversation in episode #119. Jodie builds on the concepts of bag-of-words, word2vec, and simple embedding models. We talk about the breakthrough mechanism called “attention,” which allows model training to be parallelized.
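
If you want to see the core idea in code, here is a minimal, illustrative sketch of scaled dot-product self-attention in NumPy. It isn’t code from the episode; the toy vectors and function name are assumptions made to keep the mechanism concrete.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        """Return attention-weighted values and the attention weights."""
        d_k = K.shape[-1]
        # Similarity of every query to every key, scaled to stabilize softmax
        scores = Q @ K.T / np.sqrt(d_k)
        # Softmax over keys turns scores into attention weights per query
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        # Each output row is a weighted mix of the value vectors
        return weights @ V, weights

    # Toy example: 3 tokens, 4-dimensional embeddings, attending to themselves
    rng = np.random.default_rng(0)
    x = rng.normal(size=(3, 4))
    output, weights = scaled_dot_product_attention(x, x, x)
    print(weights)  # each row sums to 1

Because every token’s scores against every other token are computed as one matrix product, the whole sequence can be processed in parallel rather than word by word, which is the parallelization discussed in the episode.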

We also discuss the two major transformer models, BERT and GPT-3. Jodie continues to share multiple resources to help you continue exploring modeling and NLP with Python.
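
A hands-on starting point, which comes up later in the episode when discussing access to GPT-2, is the Hugging Face transformers library. The snippet below is a hedged example using the library’s pipeline API with the public gpt2 and bert-base-uncased checkpoints; the prompts are made up for illustration.

    # requires: pip install transformers torch
    from transformers import pipeline

    # Text generation with GPT-2 (downloads the model on first run)
    generator = pipeline("text-generation", model="gpt2")
    print(generator("Transformers changed NLP because", max_length=30, num_return_sequences=1))

    # Masked-word prediction with BERT
    fill_mask = pipeline("fill-mask", model="bert-base-uncased")
    print(fill_mask("Attention is all you [MASK]."))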

Course Spotlight: Building a Neural Network & Making Predictions With Python AI

In this step-by-step course, you’ll build a neural network from scratch as an introduction to the world of artificial intelligence (AI) in Python. You’ll learn how to train your neural network and make predictions based on a given dataset.
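
As a taste of what “from scratch” means, here is a minimal, illustrative sketch of a single sigmoid neuron trained with gradient descent on a made-up dataset. It is not material from the course itself.

    import numpy as np

    # Toy dataset: the label is 1 when the two inputs sum to more than 1
    rng = np.random.default_rng(42)
    X = rng.random((200, 2))
    y = (X.sum(axis=1) > 1.0).astype(float).reshape(-1, 1)

    def sigmoid(z):
        return 1 / (1 + np.exp(-z))

    # A single sigmoid neuron: weights, bias, and gradient-descent training
    weights = rng.normal(size=(2, 1))
    bias = 0.0
    learning_rate = 0.5

    for _ in range(2000):
        predictions = sigmoid(X @ weights + bias)
        # Gradient of the log loss with respect to the pre-activation
        error = predictions - y
        weights -= learning_rate * X.T @ error / len(X)
        bias -= learning_rate * error.mean()

    # Predictions for new points: one above the decision boundary, one below
    print(sigmoid(np.array([[0.9, 0.8]]) @ weights + bias))  # high probability
    print(sigmoid(np.array([[0.1, 0.2]]) @ weights + bias))  # low probability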


  • 00:00:00 – Introduction
  • 00:02:20 – Where we left off with word2vec…
  • 00:03:35 – Example of losing context
  • 00:06:50 – Working at scale and adding attention
  • 00:12:34 – Multiple levels of training for the model
  • 00:14:10 – Attention is the basis for transformer models
  • 00:15:07 – BERT (Bidirectional Encoder Representations from Transformers)
  • 00:16:29 – GPT (Generative Pre-trained Transformer)
  • 00:19:08 – Video Course Spotlight
  • 00:20:08 – How far have we moved forward?
  • 00:20:41 – Access to GPT-2 via Hugging Face
  • 00:23:56 – How to access and use these models?
  • 00:30:42 – Cost of training GPT-3
  • 00:35:01 – Resources to practice and learn with BERT
  • 00:38:19 – GPT-3 and GitHub Copilot
  • 00:44:35 – DALL-E is a transformer
  • 00:46:13 – Help yourself to the show notes!
  • 00:49:19 – How can people follow your work?
  • 00:50:03 – Thanks and goodbye

Show Links:

Support the podcast & join our community of Pythonistas
