Jan 2021
1h 21m

Trends in Natural Language Processing with Sameer Singh

Sam Charrington
About this episode

Today we continue the 2020 AI Rewind series, joined by friend of the show Sameer Singh, an Assistant Professor in the Department of Computer Science at UC Irvine. 

We last spoke with Sameer at our Natural Language Processing office hours back at TWIMLfest, and he was the perfect person to help us break down 2020 in NLP. Sameer tackles the review in four main categories: Massive Language Modeling, Fundamental Problems with Language Models, Practical Vulnerabilities with Language Models, and Evaluation.

We also explore the impact of GPT-3 and Transformer models, the intersection of vision and language models, the injection of causal thinking and modeling into language models, and much more.

The complete show notes for this episode can be found at twimlai.com/go/445.

Up next
Oct 7
Recurrence and Attention for Long-Context Transformers with Jacob Buckman - #750
Today, we're joined by Jacob Buckman, co-founder and CEO of Manifest AI to discuss achieving long context in transformers. We discuss the bottlenecks of scaling context length and recent techniques to overcome them, including windowed attention, grouped query attention, and laten...
57m 23s
Sep 30
The Decentralized Future of Private AI with Illia Polosukhin - #749
In this episode, Illia Polosukhin, a co-author of the seminal "Attention Is All You Need" paper and co-founder of Near AI, joins us to discuss his vision for building private, decentralized, and user-owned AI. Illia shares his unique journey from developing the Transformer archit...
1h 5m
Sep 23
Inside Nano Banana 🍌 and the Future of Vision-Language Models with Oliver Wang - #748
Today, we’re joined by Oliver Wang, principal scientist at Google DeepMind and tech lead for Gemini 2.5 Flash Image—better known by its code name, “Nano Banana.” We dive into the development and capabilities of this newly released frontier vision-language model, beginning with th...
1h 3m
Recommended Episodes
Jan 2023
Chat GPT - Podcast Trailer
https://www.solgood.org - Check out our Streaming Service for our full collection of audiobooks, podcasts, short stories, & 10 hour sounds for sleep and relaxation at our website. Are you curious about the inner workings of a language model? Want to know how AI is revolutionizing...
1m 5s
Jul 2023
2451: How LingQ is Expanding Language Horizons With Tech
In today's episode of Tech Talks Daily, we are joined by Mark Kaufmann, the CEO of LingQ, an innovative language-learning app. LingQ has recently made headlines for announcing the addition of Swahili to their extensive catalogue of languages, raising their total to 42 languages o...
30m 21s
Jul 2023
#130 Mathew Lodge: The Future of Large Language Models in AI
Welcome to episode #130 of Eye on AI with Mathew Lodge. In this episode, we explore the world of reinforcement learning and code generation. Mathew Lodge, the CEO of Diffblue, shares insights into how reinforcement learning fuels generative AI. As we explore the intricacies of re...
49m 44s
Mar 2020
1140: Learn a New Language quickly With AI-powered Lingvist
I recently read a fascinating story about a particle physicist turned co-founder of an AI-based language learning app called Lingvist and felt compelled to find out more. Mait Müntel was part of the Higgs boson discovery team at CERN, where he was frustrated by being unable to sp...
23m 33s
Jan 2023
Chatting with ChatGPT: Pros and Cons of Advanced Language AI (Ep. 215)
In this episode, I'll be discussing the capabilities and limitations of ChatGPT, an advanced language AI model. I'll go over its power to understand and respond to natural language, and its applications in tasks such as language translation and text summarization. However, I'll a...
31m 27s
May 2023
Efficiently Retraining Language Models: How to Level Up Without Breaking the Bank (Ep. 227)
Get ready for an eye-opening episode! 🎙️ In our latest podcast episode, we dive deep into the world of LoRA (Low-Rank Adaptation) for large language models (LLMs). This groundbreaking technique is revolutionizing the way we approach language model training by leveraging low-rank...
33m 50s
Jun 2021
135. What can we learn about language from computers?
In this episode, you’ll learn about language learning from AI (Artificial Intelligence) linguist and English teacher, Yazzy Ares. In this podcast we’ll talk about the field of NLP (Natural Language Processing) and AI, and the difference between humans and computers when it comes...
46m 58s
Oct 2012
Can robots be made creative enough to invent their own language?
Luc Steels delivers the 2012 Simonyi lecture and asks can machines be creative enough to invent their own language? Professor Steels talks about some of his recent breakthrough experiments which have seen robots programmed to play language games and come up with novel concepts, w...
1h 22m