Jun 14
17m 49s

Networks and Complexity

Kyle Polich
About this episode

In this episode, Kyle gives an overview of the intersection of graph theory and computational complexity theory. In complexity theory, we reason about the runtime of an algorithm as a function of its input size. For many graph problems, the interesting questions we want to ask take longer and longer to answer as the graphs grow! This episode provides the fundamental vocabulary and signposts along the path of exploring the intersection of graph theory and computational complexity theory.
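To make the "runtime grows with input size" idea concrete, here is a minimal Python sketch (not from the episode; the toy graph and function names are just illustrative): reachability via breadth-first search runs in polynomial time, while a brute-force clique search examines every subset of vertices and quickly becomes expensive as graphs grow.

```python
from collections import deque
from itertools import combinations

# A toy undirected graph as an adjacency list (hypothetical example data).
graph = {
    "a": ["b", "c"],
    "b": ["a", "c", "d"],
    "c": ["a", "b"],
    "d": ["b"],
}

def bfs_reachable(graph, start):
    """Breadth-first search: visits each vertex and edge once, O(V + E) time."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph[node]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

def has_clique(graph, k):
    """Brute-force check for a clique of size k: examines every k-subset of
    vertices, so the work grows combinatorially with the number of vertices."""
    for subset in combinations(graph, k):
        if all(v in graph[u] for u, v in combinations(subset, 2)):
            return True
    return False

print(bfs_reachable(graph, "a"))  # {'a', 'b', 'c', 'd'}
print(has_clique(graph, 3))       # True: a, b, c form a triangle
```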

Up next
Aug 17
Networks and Recommender Systems
Kyle reveals that the next season's topic will be "Recommender Systems". Asaf shares insights on how network science contributes to the recommender systems field.
17m 45s
Jul 21
Network of Past Guests Collaborations
Kyle and Asaf discuss a project in which we link former guests of the podcast based on their co-authorship of academic papers. 
34m 10s
Jul 6
The Network Diversion Problem
In this episode, Professor Pål Grønås Drange from the University of Bergen introduces the field of Parameterized Complexity - a powerful framework for tackling hard computational problems by focusing on specific structural aspects of the input. This framework allows researchers ...
46m 14s
Recommended Episodes
Nov 2024
Mastering Algorithms: From Binary Search Trees to Dynamic Programming and Greedy Strategies
In this episode, we explore foundational algorithms and data structures that every developer and computer science enthusiast should know. Covering everything from Binary Search Trees (BSTs) to advanced concepts like Dynamic Programming and Greedy Algorithms, this episode is packed ...
28m 3s
Nov 2024
BI 199 Hessam Akhlaghpour: Natural Universal Computation
Support the show to get full episodes and join the Discord community. Hessam Akhlaghpour is a postdoctoral researcher at Rockefeller University in the Maimon lab. His experimental work is in fly neuroscience, mostly studying spatial memories in fruit flies. However, we are going t ...
1h 49m
Jul 2019
41: Reality Is More Than Complex (Group Theory and Physics)
Children who are being taught mathematics often balk at the idea of negative numbers, thinking them to be fictional entities, and often only learn later that they are useful for expressing opposite extremes of things, such as considering a debt an amount of money with a negative ...
54m 50s
Jan 2022
P12: O My God (Big O Notation)
There are times in mathematics when we are generalizing the behavior of many different, but similar, entities. One place this happens is in the use cases of Big O notation, which include describing the long-term behavior of functions and talking about how accurate numerica ...
22m 54s
Jun 21
On the philosophy of simplification in computational neuroscience - with Mazviita Chirimuuta and Terrence Sejnowski - #29
Computational neuroscientists rely on simplification when they make their models. But what is the right level of simplification? When should we, for example, use a biophysically detailed model and when a simplified abstract model when modelling neural dynamics? What are the probl ...
1h 24m
Jul 2024
#27 - Sean Carroll - The Enigma of Complexity
Is complexity the new frontier of physics? How should we approach metaphysical uncertainty? What makes a great Physicist? These are just some of the questions covered in this Win-Win episode with the incredible Sean Carroll. Sean is a theoretical physicist and philosopher who spe ...
1h 58m
Mar 2023
Once Upon an Algorithm: How Stories Explain Computing
In this episode, Martin Erwig shows us how we can find computational concepts inside some of our favorite stories. Picture a computer scientist, staring at a screen and clicking away frantically on a keyboard, hacking into a system, or perhaps developing an app. Now delete that pic ...
16m 41s
Mar 2024
Venkatesh Rao: Protocols, Intelligence, and Scaling
“There is this move from generality in a relative sense of ‘we are not as specialized as insects’ to generality in the sense of omnipotent, omniscient, godlike capabilities. And I think there's something very dangerous that happens there, which is you start thinking of the word ‘ ...
2h 18m
Nov 2024
AI and the Future of Math, with DeepMind’s AlphaProof Team
In this week’s episode of No Priors, Sarah and Elad sit down with the Google DeepMind team behind AlphaProof, a new reinforcement learning-based system for formal math reasoning that recently reached a silver-medal standard in solving International Mathematical Olympiad problems. ...
39m 21s
Jul 2017
14: Artificial Thought (Neural Networks)
Go to www.brilliant.org/breakingmathpodcast to learn neural networks, everyday physics, computer science fundamentals, the joy of problem solving, and many related topics in science, technology, engineering, and math. Mathematics takes inspiration from all forms with which life ...
1h 5m