Apr 2024
1h 5m

Physics-Informed Neural Networks (PINNs)...

Jousef Murad
About this episode

💻 Full tutorial: Physics-Informed Neural Networks (PIN...

Physics-Informed Neural Networks (PINNs) integrate known physical laws into neural network learning, particularly for solving differential equations. They embed these laws into the network's loss function, guiding the learning process beyond pure data fitting. This helps the network predict solutions that are not only data-driven but also consistent with physical principles, making PINNs especially useful in fields like fluid dynamics and heat transfer. By blending data with established physics, PINNs deliver more accurate and robust predictions, notably in data-scarce scenarios.
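To make the loss-function idea concrete, below is a minimal sketch of a PINN in Python with PyTorch (my own toy example, not code from the episode or the linked tutorial). It trains a small network to solve the ODE u'(x) + u(x) = 0 with u(0) = 1 on [0, 1], whose exact solution is exp(-x). The total loss is the sum of a physics residual term that enforces the differential equation at collocation points and a boundary term, exactly the blend of data fitting and physical law described above.

```python
# Minimal PINN sketch (toy example, PyTorch assumed; not the episode's code).
# Solves u'(x) + u(x) = 0 with u(0) = 1 on [0, 1]; exact solution is exp(-x).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Small fully connected network mapping x -> u(x)
net = nn.Sequential(
    nn.Linear(1, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

x = torch.linspace(0.0, 1.0, 100).reshape(-1, 1)
x.requires_grad_(True)        # needed to differentiate u with respect to x
x0 = torch.zeros(1, 1)        # boundary point x = 0

for step in range(5000):
    opt.zero_grad()

    # Physics residual: r(x) = du/dx + u should vanish everywhere
    u = net(x)
    du_dx = torch.autograd.grad(u, x,
                                grad_outputs=torch.ones_like(u),
                                create_graph=True)[0]
    loss_physics = ((du_dx + u) ** 2).mean()

    # Boundary (or measured-data) condition: u(0) = 1
    loss_bc = ((net(x0) - 1.0) ** 2).mean()

    # Total loss blends the physical law with the data/boundary term
    loss = loss_physics + loss_bc
    loss.backward()
    opt.step()

print(net(torch.tensor([[0.5]])).item())  # should approach exp(-0.5) ≈ 0.6065
```

In a real application the residual would encode, for example, the heat equation or the Navier-Stokes equations, and any available measurements would enter as additional data-fitting terms in the same loss.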
—————————————————————————————

🌎 Website: http://jousefmurad.com/

🌎 Technical Marketing for Your Business: 

📥 Weekly free science insights newsletter: 

🐤 Follow me on Twitter: @jousefm2

📷 Follow me on Instagram: @jousefmrd

Up next
Feb 2025
MATLAB vs. Python vs. Julia: The Hidden Truths - Gareth Thomas | Podcast #147
🌎 More about Versionbay: https://www.versionbay.com/
Connect with Gareth on LinkedIn: https://www.linkedin.com/in/g-thomas/
In this episode, we sit down with Gareth Thomas, founder of VersionBay, to explore the critical role of software versioning in engineering and how companies ...
32m 57s
Feb 2025
Smoothed Particle Hydrodynamics in the Industry - Andreas Bauer | Podcast #146
🌎 Mentioned CAE tool by Andreas: https://www.divecae.com
Andreas Bauer, Design Engineer at Kessler & Co (https://kessler-co.com) — a leading manufacturer of drivetrain and powertrain components for heavy mobility vehicles — reveals how cloud-native SPH simulation software is acce ...
28m 41s
Feb 2025
Combustion Simulation in CFD - Kelly Senecal | Podcast #145
🌎 Learn more: https://convergecfd.com/
Kelly Senecal is a co-founder of Convergent Science, a global leader in Computational Fluid Dynamics (CFD) software that is revolutionizing the industry with its predictive CFD technology. With over 26 years of expertise in the field, Seneca ...
50m 16s
Recommended Episodes
Aug 2017
[MINI] Recurrent Neural Networks
RNNs are a class of deep learning models designed to capture sequential behavior. An RNN trains a set of weights which depend not just on new input but also on the previous state of the neural network. This directed cycle allows the training phase to find solutions which rely o ...
17m 6s
Apr 2023
The Power of Graph Neural Networks: Understanding the Future of AI - Part 2/2 (Ep.224)
In this episode of our podcast, we dive deep into the fascinating world of Graph Neural Networks. First, we explore Hierarchical Networks, which allow for the efficient representation and analysis of complex graph structures by breaking them down into smaller, more manageable com ...
35m 32s
Apr 2023
The Power of Graph Neural Networks: Understanding the Future of AI - Part 1/2 (Ep.223)
In this episode, I explore the cutting-edge technology of graph neural networks (GNNs) and how they are revolutionizing the field of artificial intelligence. I break down the complex concepts behind GNNs and explain how they work by modeling the relationships between data points ...
27m 40s
Apr 2023
Olaf Sporns on Network Neuroscience
The intersection between cutting-edge neuroscience and the emerging field of network science has been growing tremendously over the past decade. Olaf Sporns, editor of Network Neuroscience, and Distinguished Professor, Provost Professor of Department of Psychological and Brain Sc ...
13m 5s
Aug 2023
Why Deep Networks and Brains Learn Similar Features with Sophia Sanborn - #644
Today we’re joined by Sophia Sanborn, a postdoctoral scholar at the University of California, Santa Barbara. In our conversation with Sophia, we explore the concept of universality between neural representations and deep neural networks, and how these principles of efficiency pro ...
45m 15s
Dec 2023
SE Radio 594: Sean Moriarity on Deep Learning with Elixir and Axon
Sean Moriarity, creator of the Axon deep learning framework, co-creator of the Nx library, and author of Machine Learning in Elixir and Genetic Algorithms in Elixir, published by the Pragmatic Bookshelf, speaks with SE Radio host Gavin Henry about what deep learning (neural netwo ...
57m 43s
Oct 2017
The Complexity of Learning Neural Networks
Over the past several years, we have seen many success stories in machine learning brought about by deep learning techniques. While the practical success of deep learning has been phenomenal, the formal guarantees have been lacking. Our current theoretical understanding of the ma ...
38m 51s
Jan 2015
Easily Fooling Deep Neural Networks
My guest this week is Anh Nguyen, a PhD student at the University of Wyoming working in the Evolving AI lab. The episode discusses the paper Deep Neural Networks are Easily Fooled [pdf] by Anh Nguyen, Jason Yosinski, and Jeff Clune. It describes a process for creating images that ...
28m 25s