Aug 2017
17m 6s

[MINI] Recurrent Neural Networks

Kyle Polich
About this episode

RNNs are a class of deep learning models designed to capture sequential behavior. An RNN trains a set of weights that are applied not just to each new input but also to the previous state of the network, so every update depends on both. This directed cycle lets training find solutions that rely on the state at an earlier time, giving the network a form of memory. RNNs have been used effectively in language analysis, translation, speech recognition, and many other tasks.
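To make that recurrence concrete, here is a minimal sketch of a vanilla (Elman-style) RNN step in NumPy. The variable names, sizes, and tanh activation are illustrative assumptions, not details from the episode.

```python
import numpy as np

# Illustrative dimensions (assumptions, not from the episode).
input_size, hidden_size = 8, 16

rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # weights applied to the new input
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # weights applied to the previous state (the directed cycle)
b_h = np.zeros(hidden_size)

def rnn_forward(inputs):
    """Apply the same weights at every time step.

    The hidden state h carries information forward from earlier steps,
    which is the "form of memory" described above.
    """
    h = np.zeros(hidden_size)
    states = []
    for x_t in inputs:
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)  # new state depends on the input AND the previous state
        states.append(h)
    return states

# Example: a sequence of 5 random input vectors.
sequence = rng.normal(size=(5, input_size))
final_state = rnn_forward(sequence)[-1]
print(final_state.shape)  # (16,) -- a fixed-size summary of the whole sequence
```

Training such a network unrolls this loop over the sequence (backpropagation through time), so gradients flow back through the earlier states as well.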

Up next
Nov 23
Designing Recommender Systems for Digital Humanities
In this episode of Data Skeptic, we explore the fascinating intersection of recommender systems and digital humanities with guest Florian Atzenhofer-Baumgartner, a PhD student at Graz University of Technology. Florian is working on Monasterium (http://monasterium.net/) ...
36m 48s
Nov 13
DataRec Library for Reproducibility in Recommender Systems
In this episode of Data Skeptic's Recommender Systems series, host Kyle Polich explores DataRec, a new Python library designed to bring reproducibility and standardization to recommender systems research. Guest Alberto Carlo Maria Mancino, a postdoc researcher from Politecnico ...
32m 48s
Nov 5
Shilling Attacks on Recommender Systems
In this episode of Data Skeptic's Recommender Systems series, Kyle sits down with Aditya Chichani, a senior machine learning engineer at Walmart, to explore the darker side of recommendation algorithms. The conversation centers on shilling attacks—a form of manipulation where mal ...
34m 48s
Recommended Episodes
Apr 2023
The Power of Graph Neural Networks: Understanding the Future of AI - Part 1/2 (Ep.223)
In this episode, I explore the cutting-edge technology of graph neural networks (GNNs) and how they are revolutionizing the field of artificial intelligence. I break down the complex concepts behind GNNs and explain how they work by modeling the relationships between data poin ...
27m 40s
Aug 2023
Why Deep Networks and Brains Learn Similar Features with Sophia Sanborn - #644
Today we’re joined by Sophia Sanborn, a postdoctoral scholar at the University of California, Santa Barbara. In our conversation with Sophia, we explore the concept of universality between neural representations and deep neural networks, and how these principles of efficiency pro ...
45m 15s
Apr 2024
Physics-Informed Neural Networks (PINNs) - Conor Daly | Podcast #120
Physics-Informed Neural Networks (PINNs) integrate known physical laws into neural network learning, particularly for solving differential equations. They embed these laws into the network's loss function, guiding the l ...
1h 5m
Apr 2023
The Power of Graph Neural Networks: Understanding the Future of AI - Part 2/2 (Ep.224)
In this episode of our podcast, we dive deep into the fascinating world of Graph Neural Networks. First, we explore Hierarchical Networks, which allow for the efficient representation and analysis of complex graph structures by breaking them down into smaller, more mana ...
35m 32s
Dec 2023
SE Radio 594: Sean Moriarity on Deep Learning with Elixir and Axon
Sean Moriarity, creator of the Axon deep learning framework, co-creator of the Nx library, and author of Machine Learning in Elixir and Genetic Algorithms in Elixir, published by the Pragmatic Bookshelf, speaks with SE Radio host ...
57m 43s
Feb 2020
Qu’est-ce que le Deep Learning ?
Deep learning ("apprentissage profond" in French) is a type of artificial intelligence derived from machine learning ("apprentissage automatique"). Here, the machine is able to learn by itself, unlike in programming, where it merely executes ...
5m 4s
May 2018
Practical Deep Learning with Rachel Thomas - TWiML Talk #138
In this episode, I'm joined by Rachel Thomas, founder and researcher at Fast AI. If you’re not familiar with Fast AI, the company offers a series of courses including Practical Deep Learning for Coders, Cutting Edge Deep Learning for Coders and Rachel’s Computational Linear Algeb ...
44m 19s
Apr 2023
Olaf Sporns on Network Neuroscience
The intersection between cutting-edge neuroscience and the emerging field of network science has been growing tremendously over the past decade. Olaf Sporns, editor of Network Neuroscience, and Distinguished Professor, Provost Professor of Department of Psychological and Brain Sc ...
13m 5s
Feb 2020
Networking Optimizations for Multi-Node Deep Learning on Kubernetes with Erez Cohen - #345
Today we conclude the KubeCon ‘19 series joined by Erez Cohen, VP of CloudX & AI at Mellanox, who we caught up with before his talk “Networking Optimizations for Multi-Node Deep Learning on Kubernetes.” In our conversation, we discuss NVIDIA’s recent acquisition of Mellanox, the ...
31m 31s
Aug 2022
Is SmartNIC a game changer for network performance? | The Backend Engineering Show
In this episode of the backend engineering show I go through the main job of the network interface controller (NIC for short) and how the datacenter is pushing it to the limit by allowing it to do more TCP/IP processing, creating what is being popularized as smartNIC. ...
21m 23s