Decision Tree Visualizer

Overview

The Decision Tree Visualizer provides an interactive way to build and understand decision trees for classification tasks. It shows how trees choose splits based on information gain or Gini impurity, draws the resulting decision boundaries in 2D space, and lets you explore the tree structure node by node. Adjusting model complexity reveals the tradeoff between training accuracy and overfitting, making this an excellent tool for understanding one of machine learning’s most interpretable algorithms.
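The same workflow the visualizer animates can be sketched in a few lines. This is a minimal, hedged example assuming scikit-learn (the tool itself may use a different implementation); the dataset, feature names, and hyperparameters are illustrative:

```python
# Fit a shallow decision tree on synthetic 2D data and print its splits.
# scikit-learn is an assumption here, not necessarily what the tool uses.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=200, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(max_depth=3, criterion="gini", random_state=0)
tree.fit(X_train, y_train)

# Each printed rule is one axis-aligned split, mirroring the tree diagram
# and the rectangular regions in the 2D boundary plot.
print(export_text(tree, feature_names=["x0", "x1"]))
print("train accuracy:", tree.score(X_train, y_train))
print("test accuracy:", tree.score(X_test, y_test))
```

Comparing the two accuracy numbers here is the text version of the overfitting readout in the visualizer.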


Tips

  • Start with max depth of 2-3 to see simple, interpretable trees, then increase to observe how complexity grows
  • Watch how decision boundaries become more intricate with deeper trees - this shows the model fitting the training data more closely
  • Compare training vs test accuracy as you increase depth - a widening gap indicates overfitting
  • Min samples per leaf acts as regularization - higher values create smoother decision boundaries
  • Information gain and Gini impurity usually produce similar results, but try both to see subtle differences
  • Generate new data to see how the tree structure adapts to different patterns
  • Notice how each split creates rectangular decision boundaries aligned with feature axes
  • Use this tool to build intuition about why decision trees are prone to overfitting without proper constraints
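The tip about the two splitting criteria can also be checked directly. A small sketch (scikit-learn assumed) that fits one tree per criterion and inspects the root split each one chose:

```python
# Compare "gini" vs "entropy" (information gain) as splitting criteria.
# On most datasets they pick the same or very similar splits.
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=2, n_informative=2,
                           n_redundant=0, random_state=1)

for criterion in ("gini", "entropy"):
    clf = DecisionTreeClassifier(max_depth=4, criterion=criterion,
                                 random_state=0).fit(X, y)
    # tree_.feature[0] / tree_.threshold[0] describe the root split.
    print(criterion, "- root splits on feature", clf.tree_.feature[0],
          "at threshold", round(float(clf.tree_.threshold[0]), 3))
```

If the two root splits differ at all, the differences tend to show up deeper in the tree and in fine details of the decision boundary, which is what the tip suggests looking for.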