Data is the heart of AI, which is why having good, clean data is so critical. But what happens when your data changes over time? What does that do to your models? In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer discuss the terms Data Drift, Model Drift, and Model Retraining.
One important term to know in AI is data drift. Also known as input drift, it describes how, over time, the data used in a given system changes from its original characteristics to new ones. This means that even good-quality data will decay over time, accumulating errors, missing values, stale values, and other issues that lead to lower data quality.
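To make the idea concrete, here is a minimal sketch of one way data drift can be flagged: compare a recent batch of a feature's values against the values seen at training time and alert when the mean shifts too far. The function name, data, and threshold are illustrative assumptions; production systems typically use richer tests such as Kolmogorov-Smirnov or the Population Stability Index.

```python
import statistics

def detect_data_drift(baseline, current, z_threshold=3.0):
    """Flag drift when the current batch's mean moves more than
    `z_threshold` baseline standard deviations from the baseline mean.
    A simplified illustration, not a standard monitoring API."""
    base_mean = statistics.mean(baseline)
    base_std = statistics.stdev(baseline)
    cur_mean = statistics.mean(current)
    z = abs(cur_mean - base_mean) / base_std
    return z > z_threshold, z

# Hypothetical feature values at training time vs. a newer, shifted batch
baseline = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7]
shifted = [13.0, 13.4, 12.8, 13.1, 12.9, 13.2, 13.3, 12.7]

drifted, score = detect_data_drift(baseline, shifted)
```

A check like this would typically run on a schedule for each input feature, so that drift is caught before it silently degrades model predictions.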
Model drift, also known as model decay or prediction drift, describes how a model that performs well against real-world data tends to perform worse over time. This can result from the real-world data and/or operational environment diverging from the data on which the model was originally trained.
As a result, model retraining is needed. We discuss these terms in greater detail and explain them at the level you need to know for AI project success.
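A common way to decide when retraining is needed is to monitor a live performance metric and trigger retraining once it falls too far below the level measured at training time. The sketch below assumes accuracy is the monitored metric; the function name and tolerance are illustrative, not part of any standard library.

```python
def should_retrain(recent_accuracy, baseline_accuracy, tolerance=0.05):
    """Return True when live accuracy has dropped more than `tolerance`
    below the accuracy measured when the model was trained.
    Illustrative threshold logic only; real pipelines may also consider
    drift scores, data volume, and retraining cost."""
    return (baseline_accuracy - recent_accuracy) > tolerance

# Model trained at 90% accuracy, now measuring 82% in production
needs_retrain = should_retrain(recent_accuracy=0.82, baseline_accuracy=0.90)
```

In practice, a trigger like this is paired with the drift checks above, since performance can degrade slowly even when no single input feature has drifted dramatically.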