Decision Tree Visualizer

Interactive visualization of decision tree learning and classification

Model Configuration

Controls for the tree's hyperparameters (the app exposes settings such as maximum depth and minimum samples, described under Help below).

Help

Decision Trees
  • How it works: Recursively split the data on feature thresholds to create increasingly homogeneous groups (see the sketch after this list)
  • Gini Impurity: The probability of misclassifying a randomly chosen sample if it were labeled according to the node's class distribution (lower is better)
  • Information Gain: The reduction in entropy achieved by a split (higher is better)
  • Max Depth: Limits how deep the tree can grow to prevent overfitting
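The split criteria above fit in a few lines of code. The sketch below is a hypothetical illustration rather than the visualizer's actual source: the helper names (`proportions`, `gini`, `entropy`, `splitGain`) are assumptions. It computes Gini impurity, entropy, and the gain of one candidate threshold on one feature.

```typescript
// Minimal sketch (hypothetical helpers, not the app's code): impurity measures
// and the gain achieved by splitting at one threshold on one feature.

/** Class proportions of a label array. */
function proportions(labels: number[]): number[] {
  const counts = new Map<number, number>();
  for (const y of labels) counts.set(y, (counts.get(y) ?? 0) + 1);
  return Array.from(counts.values()).map((c) => c / labels.length);
}

/** Gini impurity: 1 - sum_k p_k^2 (0 = pure node). */
const gini = (labels: number[]): number =>
  1 - proportions(labels).reduce((s, p) => s + p * p, 0);

/** Entropy: -sum_k p_k * log2(p_k) (0 = pure node). */
const entropy = (labels: number[]): number =>
  -proportions(labels).reduce((s, p) => s + p * Math.log2(p), 0);

/** Impurity reduction from splitting at `threshold` on column `feature`. */
function splitGain(
  X: number[][],
  y: number[],
  feature: number,
  threshold: number,
  impurity: (labels: number[]) => number = entropy // entropy => information gain
): number {
  const left: number[] = [];
  const right: number[] = [];
  X.forEach((row, i) => (row[feature] <= threshold ? left : right).push(y[i]));
  if (left.length === 0 || right.length === 0) return 0; // degenerate split
  const weighted =
    (left.length / y.length) * impurity(left) +
    (right.length / y.length) * impurity(right);
  return impurity(y) - weighted; // higher is better
}

// A perfect split on feature 0 at 0.5 recovers the parent's full entropy (1 bit).
console.log(splitGain([[0.1], [0.2], [0.8], [0.9]], [0, 0, 1, 1], 0, 0.5)); // 1
```

Passing `gini` instead of the default `entropy` switches the criterion from information gain to Gini-based impurity reduction; both rank splits from best (highest gain) to worst.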
Interpreting Results
  • Decision Boundaries: Axis-aligned splits partition the feature space into rectangular regions, each with its own prediction
  • Tree Structure: Shows the splits and the leaf nodes with their predictions (see the sketch after this list)
  • Training vs Test: A large gap between training and test accuracy indicates overfitting
  • Min Samples: Higher values create smoother boundaries
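To make the rectangular-region and training-vs-test points concrete, here is a hypothetical sketch (the `TreeNode` type and helper names are assumptions, not the app's API) of how a trained tree is traversed and how accuracy on a dataset is measured.

```typescript
// Minimal sketch (hypothetical types, not the app's code): predicting with a
// trained tree by following axis-aligned splits, and scoring a dataset.

type TreeNode =
  | { kind: "leaf"; prediction: number }
  | { kind: "split"; feature: number; threshold: number; left: TreeNode; right: TreeNode };

/** Walk the tree: each axis-aligned threshold carves the space into rectangles. */
function predict(node: TreeNode, x: number[]): number {
  if (node.kind === "leaf") return node.prediction;
  return x[node.feature] <= node.threshold
    ? predict(node.left, x)
    : predict(node.right, x);
}

/** Fraction of samples classified correctly. */
function accuracy(node: TreeNode, X: number[][], y: number[]): number {
  const correct = X.filter((x, i) => predict(node, x) === y[i]).length;
  return correct / y.length;
}

// Example tree: split on feature 0 at 0.5, class 0 on the left, class 1 on the right.
const tree: TreeNode = {
  kind: "split",
  feature: 0,
  threshold: 0.5,
  left: { kind: "leaf", prediction: 0 },
  right: { kind: "leaf", prediction: 1 },
};
console.log(accuracy(tree, [[0.2], [0.9]], [0, 1])); // 1
```

Comparing `accuracy` on the training set against `accuracy` on a held-out test set gives the train/test gap: a large gap suggests the tree has memorized the training data rather than learned a generalizable boundary.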