Apr 2024
1h 5m

Physics-Informed Neural Networks (PINNs)...

Jousef Murad
About this episode

💻 Full tutorial: Physics-Informed Neural Networks (PIN...

Physics-Informed Neural Networks (PINNs) integrate known physical laws into neural network learning, particularly for solving differential equations. They embed these laws in the network's loss function, guiding training beyond pure data fitting: the network is pushed toward solutions that are both data-driven and consistent with physical principles. This makes PINNs especially useful in fields such as fluid dynamics and heat transfer, and by blending data with established physics they deliver more accurate and robust predictions in data-scarce scenarios.
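As a rough illustration of how the physics enters the loss (a minimal sketch, not taken from the episode or the linked tutorial): a small network learns u(x) for the toy ODE du/dx = -u with u(0) = 1, and the ODE residual is added to the loss alongside the boundary condition. It assumes PyTorch; the network size, collocation points, and optimizer settings are arbitrary choices.

```python
# Minimal PINN sketch (illustrative only): solve du/dx = -u, u(0) = 1 on [0, 2].
import torch
import torch.nn as nn

torch.manual_seed(0)

# Small network approximating the solution u(x)
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

x = torch.linspace(0.0, 2.0, 100).reshape(-1, 1).requires_grad_(True)  # collocation points
x0 = torch.zeros(1, 1)                                                 # boundary point x = 0

for step in range(5000):
    optimizer.zero_grad()
    u = net(x)
    # du/dx via automatic differentiation
    du_dx = torch.autograd.grad(u, x, grad_outputs=torch.ones_like(u), create_graph=True)[0]
    physics_loss = ((du_dx + u) ** 2).mean()       # residual of the governing equation
    boundary_loss = ((net(x0) - 1.0) ** 2).mean()  # enforce u(0) = 1
    loss = physics_loss + boundary_loss            # a data-misfit term would be added here
    loss.backward()
    optimizer.step()

# After training, net(x) should approximate the exact solution u(x) = exp(-x).
```

If measurements were available, a data term such as ((net(x_data) - u_data) ** 2).mean() (hypothetical names) would simply be added to the same loss; that sum of data and physics terms is the blending the description refers to.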
—————————————————————————————

🌎 Website: http://jousefmurad.com/

🌎 Technical Marketing for Your Business: 

📥 Weekly free science insights newsletter: 

🐤 Follow me on Twitter: @jousefm2

📷 Follow me on Instagram: @jousefmrd

Up next
Sep 11
CFD Visualization Best Practices - Stephan Groß | Podcast #158
Explore Siemens Simcenter: https://plm.sw.siemens.com/en-US/simcenter
Connect with Stephan on LinkedIn: https://www.linkedin.com/in/s-gross/
In this episode, Stephan Groß, Technical Marketing Engineer at Siemens, shares his unique journey from working on race car aerodynamics to be ...
49m 12s
Sep 11
Siemens’ Digital Thread: Connecting Design & Simulation - Bob Ransijn | Podcast #157
Explore Simcenter Systems simulation: https://plm.sw.siemens.com/en-US/simcenter/systems-simulation/
Connect with Bob on LinkedIn: https://www.linkedin.com/in/bobransijn/
In this insightful conversation, we sit down with Bob, a seasoned System Simulation Specialist and Presales Man ...
25m 40s
Sep 11
Smoothed Particle Hydrodynamics, EV Trends & AI in CFD - Martin Sonntag | Podcast #156
Explore shonDynamics: https://shondynamics.de
Connect with Martin on LinkedIn: https://www.linkedin.com/in/martin-sonntag-7081a6127/
From its early beginnings to a global presence, join CTO of shonDynamics, Martin Sonntag, as he shares the story of how shonDynamics was built and sca ...
30m 40s
Recommended Episodes
Aug 2017
[MINI] Recurrent Neural Networks
RNNs are a class of deep learning models designed to capture sequential behavior. An RNN trains a set of weights that are applied not just to the new input but also to the previous state of the network (a minimal sketch of this recurrence follows this listing). This directed cycle allows the training phase to find solutions which rely o ...
17m 6s
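As a rough sketch of the recurrence described in the blurb above (illustrative, not from the episode), here is a bare-bones RNN cell: the same weight matrices are reused at every time step, and each new hidden state combines the current input with the previous hidden state. Dimensions and initialization are arbitrary choices.

```python
# Minimal RNN cell sketch (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim = 4, 8
W_x = rng.normal(scale=0.1, size=(hidden_dim, input_dim))   # input-to-hidden weights
W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))  # hidden-to-hidden weights (the directed cycle)
b = np.zeros(hidden_dim)

def rnn_forward(inputs):
    """Run a sequence through the cell, reusing the same weights at every step."""
    h = np.zeros(hidden_dim)                      # initial hidden state
    states = []
    for x_t in inputs:                            # order matters: sequential behavior
        h = np.tanh(W_x @ x_t + W_h @ h + b)      # depends on the new input AND the previous state
        states.append(h)
    return states

sequence = [rng.normal(size=input_dim) for _ in range(5)]
hidden_states = rnn_forward(sequence)             # one hidden state per time step
```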
Apr 2023
The Power of Graph Neural Networks: Understanding the Future of AI - Part 2/2 (Ep.224)
In this episode of our podcast, we dive deep into the fascinating world of Graph Neural Networks. First, we explore Hierarchical Networks, which allow for the efficient representation and analysis of complex graph structures by breaking them down into smaller, more manageable com ...
35m 32s
Apr 2023
The Power of Graph Neural Networks: Understanding the Future of AI - Part 1/2 (Ep.223)
In this episode, I explore the cutting-edge technology of graph neural networks (GNNs) and how they are revolutionizing the field of artificial intelligence. I break down the complex concepts behind GNNs and explain how they work by modeling the relationships between data points ... (a one-layer message-passing sketch of this idea follows this listing)
27m 40s
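As a rough, one-layer illustration of "modeling the relationships between data points" (not from the episode): in a simple graph convolution, each node's new feature vector is a transformed average of its neighbours' features. The toy graph, feature sizes, and normalization below are arbitrary choices.

```python
# Minimal graph-convolution sketch (illustrative only): H_next = act(D^-1 (A + I) H W)
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph on 4 nodes (a square): adjacency matrix plus self-loops
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)
D_inv = np.diag(1.0 / A_hat.sum(axis=1))       # normalize by node degree

H = rng.normal(size=(4, 3))                    # per-node input features
W = rng.normal(size=(3, 2))                    # learnable layer weights

# Each node aggregates its neighbours (and itself), then applies a shared transform
H_next = np.tanh(D_inv @ A_hat @ H @ W)        # shape (4, 2): new node embeddings
```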
Apr 2023
Olaf Sporns on Network Neuroscience
The intersection between cutting-edge neuroscience and the emerging field of network science has been growing tremendously over the past decade. Olaf Sporns, editor of Network Neuroscience, and Distinguished Professor, Provost Professor of the Department of Psychological and Brain Sc ...
13m 5s
Aug 2023
Why Deep Networks and Brains Learn Similar Features with Sophia Sanborn - #644
Today we’re joined by Sophia Sanborn, a postdoctoral scholar at the University of California, Santa Barbara. In our conversation with Sophia, we explore the concept of universality between neural representations and deep neural networks, and how these principles of efficiency pro ...
45m 15s
Dec 2023
SE Radio 594: Sean Moriarity on Deep Learning with Elixir and Axon
Sean Moriarity, creator of the Axon deep learning framework, co-creator of the Nx library, and author of Machine Learning in Elixir and Genetic Algorithms in Elixir, published by the Pragmatic Bookshelf, speaks with SE Radio host Gavin Henry about what deep learning (neural netwo ...
57m 43s
Oct 2017
The Complexity of Learning Neural Networks
Over the past several years, we have seen many success stories in machine learning brought about by deep learning techniques. While the practical success of deep learning has been phenomenal, the formal guarantees have been lacking. Our current theoretical understanding of the ma ...
38m 51s
Jan 2015
Easily Fooling Deep Neural Networks
My guest this week is Anh Nguyen, a PhD student at the University of Wyoming working in the Evolving AI lab. The episode discusses the paper Deep Neural Networks are Easily Fooled [pdf] by Anh Nguyen, Jason Yosinski, and Jeff Clune. It describes a process for creating images that ...
28m 25s