Apr 2024
1h 5m

Physics-Informed Neural Networks (PINNs)...

Jousef Murad
About this episode

💻 Full tutorial: • Physics-Informed Neural Networks (PIN...

Physics-Informed Neural Networks (PINNs) embed known physical laws, typically differential equations, directly into a neural network's loss function, so that training is guided by physics as well as by data. The network is penalized not only for mismatching observations but also for violating the governing equations, which steers it toward physically consistent solutions. This makes PINNs especially useful in fields such as fluid dynamics and heat transfer, and particularly valuable in data-scarce scenarios, where the physics term compensates for limited observations.
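The composite loss described above (data fitting plus a physics residual) can be sketched for a toy problem. This is not code from the episode: it is a minimal NumPy illustration for the ODE u'(t) = -u(t) with u(0) = 1, using a quadratic surrogate whose derivative is known in closed form. Real PINNs use a neural network and automatic differentiation (e.g. in PyTorch or JAX); the function and variable names here are hypothetical.

```python
import numpy as np

def pinn_loss(theta, t_data, u_data, t_phys, lam=1.0):
    """Composite PINN-style loss for the toy ODE u'(t) = -u(t), u(0) = 1.

    The surrogate is a quadratic u(t) = a + b*t + c*t**2, so its
    derivative is available analytically (real PINNs use autodiff).
    """
    a, b, c = theta

    u = lambda t: a + b * t + c * t**2   # surrogate solution
    du = lambda t: b + 2 * c * t         # its exact derivative

    # Data-fitting term: mean squared error against observations.
    data_loss = np.mean((u(t_data) - u_data) ** 2)

    # Physics term: residual of the ODE u' + u = 0 at collocation points,
    # which need no labels -- this is where the physics guides training.
    physics_loss = np.mean((du(t_phys) + u(t_phys)) ** 2)

    # Initial-condition term: enforce u(0) = 1.
    ic_loss = (u(0.0) - 1.0) ** 2

    return data_loss + lam * physics_loss + ic_loss

# Evaluate the loss for a rough guess of the coefficients.
t_data = np.array([0.0, 0.5, 1.0])
u_data = np.exp(-t_data)               # samples of the true solution e^{-t}
t_phys = np.linspace(0.0, 1.0, 20)     # unlabeled collocation points

loss = pinn_loss((1.0, -1.0, 0.5), t_data, u_data, t_phys)
```

In a real PINN this loss would be minimized over the network's weights with gradient descent; coefficients close to the Taylor expansion of e^{-t} (1, -1, 0.5) already give a much smaller loss than an uninformed guess.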
—————————————————————————————

🌎 Website: ⁠http://jousefmurad.com/⁠

🌎 Technical Marketing for Your Business: 

📥 Weekly free science insights newsletter: 

🐤 Follow me on Twitter: @jousefm2

📷 Follow me on Instagram: @jousefmrd

Up next
Aug 3
AI Agents in Engineering - Sam Bydlon and Ram Seetharaman | Podcast #151
AI agents are the next big thing in automation, but how do they actually work, and where do they fit into the broader landscape of AI? In this episode, we sit down with Sam Bydlon, a Senior Specialist Solutions Architect at AWS, and Ram Seetharaman, an AI Lead at Synera, to explor...
38m 40s
Aug 3
CFD & Wind Tunnel Testing in High-Performance Engineering - Adrian Villar Collazo | Podcast #150
CFD and wind tunnel testing are the backbone of high-performance engineering — but how do they complement each other, and what are the biggest challenges engineers face? In this episode, we sit down with Adrian Villar Collazo, Managing Director at VFluid Advanced Technologies and...
32m 45s
Aug 3
Automatic Differentiation and Adjoint Methods - Felix Köhler | Podcast #149
In this episode, we sit down with Felix, a YouTube educator and PhD researcher in computational science, to explore the intersection of engineering, AI, and simulation. We dive into his journey of growing a YouTube channel from scratch and discuss whether AI can truly replace phy...
1h 5m
Recommended Episodes
Aug 2017
[MINI] Recurrent Neural Networks
RNNs are a class of deep learning models designed to capture sequential behavior. An RNN trains a set of weights which depend not just on new input but also on the previous state of the neural network. This directed cycle allows the training phase to find solutions which rely o...
17m 6s
Apr 2023
The Power of Graph Neural Networks: Understanding the Future of AI - Part 2/2 (Ep.224)
In this episode of our podcast, we dive deep into the fascinating world of Graph Neural Networks. First, we explore Hierarchical Networks, which allow for the efficient representation and analysis of complex graph structures by breaking them down into smaller, more manageable com...
35m 32s
Apr 2023
The Power of Graph Neural Networks: Understanding the Future of AI - Part 1/2 (Ep.223)
In this episode, I explore the cutting-edge technology of graph neural networks (GNNs) and how they are revolutionizing the field of artificial intelligence. I break down the complex concepts behind GNNs and explain how they work by modeling the relationships between data points...
27m 40s
Apr 2023
Olaf Sporns on Network Neuroscience
The intersection between cutting-edge neuroscience and the emerging field of network science has been growing tremendously over the past decade. Olaf Sporns, editor of Network Neuroscience, and Distinguished Professor, Provost Professor of Department of Psychological and Brain Sc...
13m 5s
Aug 2023
Why Deep Networks and Brains Learn Similar Features with Sophia Sanborn - #644
Today we’re joined by Sophia Sanborn, a postdoctoral scholar at the University of California, Santa Barbara. In our conversation with Sophia, we explore the concept of universality between neural representations and deep neural networks, and how these principles of efficiency pro...
45m 15s
Dec 2023
SE Radio 594: Sean Moriarity on Deep Learning with Elixir and Axon
Sean Moriarity, creator of the Axon deep learning framework, co-creator of the Nx library, and author of Machine Learning in Elixir and Genetic Algorithms in Elixir, published by the Pragmatic Bookshelf, speaks with SE Radio host Gavin Henry about what deep learning (neural netwo...
57m 43s
Oct 2017
The Complexity of Learning Neural Networks
Over the past several years, we have seen many success stories in machine learning brought about by deep learning techniques. While the practical success of deep learning has been phenomenal, the formal guarantees have been lacking. Our current theoretical understanding of the ma...
38m 51s
Jan 2015
Easily Fooling Deep Neural Networks
My guest this week is Anh Nguyen, a PhD student at the University of Wyoming working in the Evolving AI lab. The episode discusses the paper Deep Neural Networks are Easily Fooled [pdf] by Anh Nguyen, Jason Yosinski, and Jeff Clune. It describes a process for creating images that...
28m 25s