Overfitting happens when a machine learning model learns its training data too closely, including its noise and irrelevant details, and as a result fails to perform well on new data.
Imagine you're trying to draw a picture of your favorite animal, say a cat, based only on the one cat you see every day. You might start adding extra details, like the exact pattern of its fur or the way its whiskers bend in the light. But if someone shows you a different cat later, your drawing might not match it at all, because you focused too much on the specific details of a single cat.
Like Learning Too Many Tricks
Think of a model that passes a test by memorizing tricks. It memorizes every question and answer in its training set, but when new questions come up, it doesn't know how to solve them. It's like having a friend who knows all the answers by heart but can't explain the math behind them.
A Simple Example
Say you're trying to guess your friend's favorite number based on what they tell you. If they give you only one number and you remember it exactly, you might get that one right, but when they ask about a different number next time, you'll be stuck. That's overfitting: learning too much from a small set of examples.
Examples
- A student memorizes all the answers to a test but can't solve new problems.
- A model learns every name in a class but can't recognize new students.
- A chef copies a recipe exactly but can’t adjust it for different ingredients.
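The memorizing student in the examples above can be sketched in code. Here is a minimal NumPy illustration (the data, seed, and polynomial degrees are all assumptions for the demo): ten noisy points from a simple linear trend are fit twice, once with a straight line and once with a degree-9 polynomial that passes through every training point, like memorizing every answer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: a simple linear trend (y = 2x) plus a little noise.
x_train = np.linspace(0, 1, 10)
y_train = 2 * x_train + rng.normal(0, 0.1, size=10)

# New data drawn from the same underlying trend.
x_test = np.linspace(0.05, 0.95, 10)
y_test = 2 * x_test + rng.normal(0, 0.1, size=10)

def mse(coeffs, x, y):
    """Mean squared error of a fitted polynomial on (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# A straight line captures the general trend.
simple = np.polyfit(x_train, y_train, deg=1)
# A degree-9 polynomial through 10 points memorizes every wiggle of the noise.
memorizer = np.polyfit(x_train, y_train, deg=9)

print("simple  - train:", mse(simple, x_train, y_train),
      "test:", mse(simple, x_test, y_test))
print("memorizer - train:", mse(memorizer, x_train, y_train),
      "test:", mse(memorizer, x_test, y_test))
```

The memorizing polynomial scores nearly zero error on the training points, yet its error jumps on the new points, while the simple line stays roughly as good on both. Low training error with much higher error on fresh data is the signature of overfitting.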
See also
- How Does Machine Learning Explained in 100 Seconds Work?
- What is Machine learning?
- What are machine learning techniques?
- What are machine learning models?
- How AI really works (...it’s not actually intelligent)?