Neural Network Playground
Build, train, and visualize neural networks interactively
Network Architecture
Number of Hidden Layers:
1
Neurons per Hidden Layer:
4
Activation Function:
ReLU
Tanh
Sigmoid
Learning Rate:
0.100
Dataset:
Concentric Circles
XOR Pattern
Spiral
Blobs
Training Epochs:
Build Network
Train Network
Reset Weights
Network Visualization
Download Data
Decision Boundary
Network Architecture
Training Metrics
Training Progress
Help
Neural Networks
Architecture:
Input layer → Hidden layers → Output layer
Forward Pass:
Data flows forward through the network, layer by layer, computing each layer's activations
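The forward pass described above can be sketched in a few lines. This is an illustrative example only, not the playground's actual implementation: the weights below are hypothetical, and the shape (2 inputs, 2 hidden ReLU neurons, 1 linear output) is chosen just to keep the arithmetic visible.

```python
def relu(x):
    return max(0.0, x)

def forward(x, W1, b1, W2, b2):
    # Hidden layer: each neuron computes relu(w . x + b)
    hidden = [relu(sum(w_i * x_i for w_i, x_i in zip(w, x)) + b)
              for w, b in zip(W1, b1)]
    # Output layer: a single linear neuron over the hidden activations
    return sum(w_i * h_i for w_i, h_i in zip(W2, hidden)) + b2

W1 = [[1.0, -1.0], [-1.0, 1.0]]  # hypothetical weights: 2 hidden neurons, 2 inputs
b1 = [0.0, 0.0]
W2 = [1.0, 1.0]
b2 = 0.0
print(forward([0.5, 0.2], W1, b1, W2, b2))  # prints 0.3
```

Each hidden neuron sees the same input vector but applies its own weights, which is why adding neurons per layer adds capacity.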
Backpropagation:
Gradients flow backward to update weights
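Backpropagation is easiest to see on a single neuron. A minimal sketch, assuming a linear neuron y = w*x + b with squared loss (all numbers hypothetical, not the playground's): the chain rule turns the loss gradient into per-weight updates, which gradient descent then applies.

```python
def step(w, b, x, target, lr):
    y = w * x + b                  # forward pass
    dloss_dy = 2.0 * (y - target)  # d/dy of (y - target)^2
    dw = dloss_dy * x              # chain rule: dy/dw = x
    db = dloss_dy * 1.0            # chain rule: dy/db = 1
    return w - lr * dw, b - lr * db

w, b = 0.0, 0.0
for _ in range(100):
    w, b = step(w, b, x=1.0, target=2.0, lr=0.1)
print(round(w + b, 3))  # prints 2.0: the neuron has fit the target
```

In a multi-layer network the same chain rule is applied layer by layer, which is why the gradients are said to "flow backward".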
Activation Functions:
Add non-linearity so networks can learn complex patterns
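The three activation choices offered above (ReLU, Tanh, Sigmoid) are standard functions; a quick side-by-side sketch in Python (for illustration only, independent of the playground's implementation):

```python
import math

def relu(x):    return max(0.0, x)          # clips negatives to 0
def tanh(x):    return math.tanh(x)          # squashes to (-1, 1)
def sigmoid(x): return 1.0 / (1.0 + math.exp(-x))  # squashes to (0, 1)

for x in (-2.0, 0.0, 2.0):
    print(x, relu(x), round(tanh(x), 3), round(sigmoid(x), 3))
```

Without one of these non-linearities, stacking layers collapses into a single linear map, so no composition of layers could learn a pattern like XOR.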
Training Tips
Start Simple:
Begin with one hidden layer, add more if needed
Learning Rate:
Too high a rate makes training unstable; too low makes it slow
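The learning-rate tip above can be demonstrated on the simplest possible loss. A sketch (not the playground's code) using gradient descent on f(w) = w², where each step is w -= lr * f'(w) = w - 2*lr*w:

```python
def run(lr, steps=20, w=1.0):
    # Repeated gradient steps on f(w) = w^2; f'(w) = 2w
    for _ in range(steps):
        w -= lr * 2.0 * w
    return w

print(abs(run(0.1)))    # small: converges toward the minimum at 0
print(abs(run(0.001)))  # still near 1.0: converging, but slowly
print(abs(run(1.1)))    # huge: each step overshoots and diverges
```

The same trade-off holds for the 0.100 default above: large enough to make progress, small enough not to overshoot.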
Layers vs Neurons:
More layers = more complex patterns; more neurons = more capacity
XOR Problem:
A classic problem that a single-layer (linear) network cannot solve; it requires at least one hidden layer
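To make the XOR point concrete, here is a hand-picked two-hidden-neuron ReLU network that computes XOR exactly (the weights are chosen by hand for illustration, not learned by the playground). No single linear neuron can separate (0,1) and (1,0) from (0,0) and (1,1), but one hidden layer suffices:

```python
def relu(x):
    return max(0.0, x)

def xor_net(x1, x2):
    h1 = relu(x1 + x2)        # counts how many inputs are on
    h2 = relu(x1 + x2 - 1.0)  # fires only when both inputs are on
    return h1 - 2.0 * h2      # cancels the "both on" case

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))  # output matches a XOR b
```

The hidden layer gives the network two intermediate features whose difference is non-linear in the inputs, which is exactly what the XOR Pattern dataset above is designed to require.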