When it comes to building ML models, you want a model that is simple enough to generalize to a wide range of real-world data, but not so simple that it overgeneralizes and underfits the available data. In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the terms Overfitting, Underfitting, Bias, Variance, and Bias/Variance Tradeoff, and explain how they relate to AI and why it's important to know about them.
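As a rough illustration of the bias/variance tradeoff discussed in the episode (this sketch is not from the podcast itself), the hypothetical NumPy snippet below fits noisy samples of a sine curve with polynomials of increasing degree: a low-degree fit underfits (high bias), a very high-degree fit overfits (high variance), and something in between balances the two.

```python
import numpy as np

# Hypothetical example: underfitting vs. overfitting on noisy sine data.
rng = np.random.default_rng(0)
x_train = np.sort(rng.uniform(0, 2 * np.pi, 20))
y_train = np.sin(x_train) + rng.normal(0, 0.2, x_train.size)
x_test = np.linspace(0, 2 * np.pi, 200)
y_test = np.sin(x_test)

for degree in (1, 4, 10):
    # Fit a polynomial of the given degree to the training data.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree={degree:2d}  train MSE={train_mse:.3f}  test MSE={test_mse:.3f}")

# Degree 1 underfits: both training and test error stay high (high bias).
# Degree 10 overfits: training error is tiny but test error grows (high variance).
# A moderate degree balances the two -- the bias/variance tradeoff.
```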
Show Notes: