Overfitting vs Underfitting Demo

Visualize how model complexity affects bias, variance, and generalization

The demo page contains panels for Data Generation, Model Configuration, Model Fit, and Training vs Test Error, plus a Model Performance readout listing Polynomial Degree, Training MSE, Test MSE, Training R², and Test R² (all blank, with status "Not Trained", until a model is fitted).
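The metrics in the Model Performance readout can be reproduced offline. The sketch below is a minimal stand-in for the demo, assuming scikit-learn and a hypothetical noisy-sine data generator (the demo's actual generator and degree setting may differ): it fits a polynomial model and prints the same four numbers the panel displays.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic data: a sine curve plus Gaussian noise (an assumption --
# the demo's real Data Generation step may use a different function).
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 60).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel() + rng.normal(0, 0.2, 60)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

degree = 3  # the "Polynomial Degree" knob from the Model Configuration panel
model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
model.fit(X_train, y_train)

# Print the four values shown in the Model Performance readout.
for name, Xs, ys in [("Training", X_train, y_train), ("Test", X_test, y_test)]:
    pred = model.predict(Xs)
    print(f"{name} MSE: {mean_squared_error(ys, pred):.4f}  "
          f"R²: {r2_score(ys, pred):.4f}")
```

The pipeline expands the single feature into polynomial terms and then fits ordinary least squares, which is the standard way to build a polynomial regressor in scikit-learn.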

About This Demo

This interactive demo helps you understand the bias-variance tradeoff in machine learning. Adjust the polynomial degree to see how model complexity affects both training and test performance.

Underfitting (High Bias): The model is too simple, so it performs poorly on both the training and test data.
Good Fit: Complexity is well balanced, giving good performance on both the training and test data.
Overfitting (High Variance): The model is too complex, so it fits the training data almost perfectly but generalizes poorly to the test data.

Tips: Start with degree 1 to see underfitting, increase the degree gradually to find the sweet spot, then push past degree 10 to observe overfitting. Watch how the gap between training and test error grows as complexity increases!
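The degree sweep described in the tips can be sketched in a few lines. This is an illustrative stand-in, assuming scikit-learn and the same hypothetical noisy-sine data as above; the specific degrees (1, 3, 12) are chosen to land in the underfit, good-fit, and overfit regimes.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

# Hypothetical data generator: noisy sine, 28 training / 12 test points.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, 40).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel() + rng.normal(0, 0.2, 40)
split = 28
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

for degree in (1, 3, 12):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_tr, y_tr)
    tr = mean_squared_error(y_tr, model.predict(X_tr))
    te = mean_squared_error(y_te, model.predict(X_te))
    # The train-test gap is the quantity to watch: small when underfitting,
    # large when overfitting.
    print(f"degree {degree:2d}: train MSE {tr:.4f}, test MSE {te:.4f}, "
          f"gap {te - tr:.4f}")
```

Because the polynomial feature sets are nested, training MSE can only shrink as the degree grows; it is the test MSE, and hence the gap, that eventually turns upward.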