About this episode
Logistic regression is a supervised machine learning algorithm for classification: it distinguishes items by class (such as "expensive" or "not expensive") rather than predicting continuous numerical values. It works by applying a sigmoid (logistic) function to the output of a linear regression model to produce a probability, which is then used to assign a class label. Training follows a recurring loop: predict with the hypothesis function, evaluate error with a log likelihood function, and optimize the parameters with gradient descent.
Classification versus Regression in Supervised Learning
- Supervised learning consists of two main tasks: regression and classification.
- Regression algorithms predict continuous values, while classification algorithms assign classes or categories to data points.
The Role and Nature of Logistic Regression
- Logistic regression is a classification algorithm, despite its historically confusing name.
- The algorithm determines the probability that an input belongs to a specific class, using outputs between zero and one.
How Logistic Regression Works
- The process starts by passing inputs through a linear regression function, then applying a logistic (sigmoid) function to produce a probability.
- For binary classification, results above 0.5 usually indicate a positive class (for example, "expensive"), and results below 0.5 indicate a negative class ("not expensive").
- Multiclass problems assign a probability to each class and select the class with the highest probability using the arg max function, as the sketch below illustrates.
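A minimal sketch of this prediction step in Python with NumPy; the parameter values and feature numbers below are invented for illustration, not taken from the episode:

```python
import numpy as np

def sigmoid(z):
    # Squash any real number into the (0, 1) range.
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(theta, x):
    # Linear regression output (theta dot x) passed through the logistic function.
    return sigmoid(theta @ x)

# Invented parameters; x = [bias term, square footage, bedrooms].
theta = np.array([-8.0, 0.003, 0.5])
x = np.array([1.0, 2500.0, 3.0])

p = predict_proba(theta, x)   # ~0.73: probability of the positive class
label = 1 if p > 0.5 else 0   # binary: threshold at 0.5

# Multiclass: one probability per class, then pick the arg max.
class_probs = np.array([0.2, 0.7, 0.1])  # e.g. from three one-vs-rest models
best_class = np.argmax(class_probs)      # index 1, the most probable class
```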
Example Application: Housing Spreadsheet
- An example uses a spreadsheet of houses with features like square footage and number of bedrooms, labeling each as "expensive" (1) or "not expensive" (0).
- Logistic regression uses the spreadsheet data to learn the pattern that separates expensive houses from less expensive ones; a sketch of this data layout follows.
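As a sketch of that layout, the spreadsheet could be represented as a feature matrix and a label vector; the houses and labels below are invented:

```python
import numpy as np

# Each row is a house: [square footage, number of bedrooms].
X = np.array([
    [2500.0, 4.0],
    [ 800.0, 1.0],
    [1800.0, 3.0],
    [1100.0, 2.0],
])

# 1 = "expensive", 0 = "not expensive" (invented labels).
y = np.array([1.0, 0.0, 1.0, 0.0])
```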
Steps in Logistic Regression
- The algorithm follows three steps: predict (infer a class), evaluate error (calculate how inaccurate the guesses were), and train (refine the underlying parameters).
- Predictions are compared to the actual labels, and the error is calculated with a log likelihood function, which penalizes confident wrong predictions more heavily than uncertain ones.
- Model parameters (theta values) are updated using gradient descent, which iteratively reduces the error by adjusting the thetas in the direction indicated by the derivative of the error function; a sketch of this loop follows.
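A compact sketch of that predict/evaluate/train loop as batch gradient descent; the learning rate and step count are arbitrary choices, and features are assumed to be scaled to a similar range (raw values like square footage would need normalization first):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, lr=0.1, steps=1000):
    # Prepend a column of ones so theta[0] serves as the bias term.
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    theta = np.zeros(Xb.shape[1])
    m = len(y)
    for _ in range(steps):
        # 1. Predict: run the hypothesis on every example.
        h = np.clip(sigmoid(Xb @ theta), 1e-12, 1 - 1e-12)
        # 2. Evaluate error: negative log likelihood (could be logged to watch training).
        cost = -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))
        # 3. Train: step theta downhill along the derivative of the cost.
        theta -= lr * (Xb.T @ (h - y)) / m
    return theta
```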
The Mathematical Foundation
- The hypothesis function is the sigmoid or logistic function, with the formula: 1 / (1 + e^(-theta^T x)), where theta represents the parameters and x the input features.
- The error function (cost function) for logistic regression uses log likelihood, aggregating errors over all data points to guide model learning; both formulas are written out below.
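The cost function is not written out in these notes; the expressions below give the hypothesis from above together with the standard negative log likelihood form over m training examples:

```latex
h_\theta(x) = \frac{1}{1 + e^{-\theta^{T} x}}

J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[\, y^{(i)} \log h_\theta\!\left(x^{(i)}\right)
          + \left(1 - y^{(i)}\right) \log\!\left(1 - h_\theta\!\left(x^{(i)}\right)\right) \right]
```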
Practical Considerations
- Logistic regression finds a "decision boundary": the point along the sigmoid S-curve where the predicted probability crosses 0.5, which best separates classes such as "expensive" versus "not expensive."
- When the architecture requires a proper probability distribution (probabilities summing to one) across classes, a softmax function is applied to the outputs; softmax is not covered in this episode, but a brief sketch follows for reference.
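For reference only (the episode defers the topic), a minimal sketch of the normalization softmax performs, turning arbitrary scores into probabilities that sum to one:

```python
import numpy as np

def softmax(z):
    # Shift by the max for numerical stability; the result sums to 1.
    e = np.exp(z - np.max(z))
    return e / e.sum()

softmax(np.array([2.0, 1.0, 0.1]))  # -> approximately [0.66, 0.24, 0.10]
```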
Composability in Machine Learning
- Machine learning architectures are highly compositional, with functions nested within other functions - logistic regression itself is a function of linear regression.
- This composability underpins more complex systems like neural networks, where each "neuron" can be seen as a logistic regression unit powered by linear regression, as the sketch below illustrates.
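A sketch of that composition: each "neuron" below is just a logistic regression unit (a sigmoid wrapped around a linear function), with invented weights; two neurons feed a third:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neuron(w, b, x):
    # A single "neuron": logistic regression applied to a linear function of x.
    return sigmoid(w @ x + b)

# Invented weights: two hidden neurons whose outputs feed one output neuron.
x = np.array([0.5, -1.2])
h1 = neuron(np.array([0.8, -0.4]), 0.1, x)
h2 = neuron(np.array([-0.3, 0.9]), 0.0, x)
out = neuron(np.array([1.5, -2.0]), 0.2, np.array([h1, h2]))
```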
Building Toward Advanced Topics
- Understanding logistic and linear regression forms the foundation for approaching advanced areas of machine learning such as deep learning and neural networks.
- The concepts of prediction, error measurement, and iterative training recur in more sophisticated models.
Resource Recommendations
- The episode recommends Andrew Ng's Coursera course for deeper study of these concepts, especially for further exploration of multivariate regression and error functions.