Shannon Entropy Calculator

Calculate information entropy: H(X) = -Σ p(x) log₂ p(x)

Input Mode: Probability Distribution

Enter probabilities separated by commas (e.g., 0.5, 0.25, 0.25). Values must be non-negative and sum to 1.
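
Behind the scenes, the calculator has to turn that comma-separated string into a validated distribution before it can compute anything. Below is a minimal Python sketch of one way to do this; the function name parse_probabilities and the 1e-9 sum tolerance are illustrative assumptions, not the calculator's actual code.

```python
import math

def parse_probabilities(text: str) -> list[float]:
    """Parse a comma-separated string like '0.5, 0.25, 0.25' into a distribution."""
    probs = [float(tok) for tok in text.split(",") if tok.strip()]
    if not probs:
        raise ValueError("no probabilities given")
    if any(p < 0 for p in probs):
        raise ValueError("probabilities must be non-negative")
    # Allow a little floating-point slack in the sum-to-1 check (assumed tolerance).
    if not math.isclose(sum(probs), 1.0, abs_tol=1e-9):
        raise ValueError(f"probabilities must sum to 1, got {sum(probs)}")
    return probs

print(parse_probabilities("0.5, 0.25, 0.25"))  # [0.5, 0.25, 0.25]
```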

Understanding Shannon Entropy

What is Shannon Entropy? Shannon entropy measures the average amount of information (or uncertainty) in a random variable: the harder the outcomes are to predict, the higher the entropy. When the logarithm is taken base 2 (log₂), entropy is measured in bits.

Formula: H(X) = -Σ p(x) log₂ p(x)
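
The formula translates almost directly into code. The sketch below is an illustrative Python implementation (not necessarily what this calculator runs); note the standard convention that terms with p(x) = 0 contribute nothing, since x log₂ x → 0 as x → 0.

```python
import math

def shannon_entropy(probs: list[float]) -> float:
    """H(X) = -sum(p * log2(p)); terms with p == 0 contribute 0 by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.469 bits
print(shannon_entropy([1.0]))        # certain outcome: -0.0 (i.e., 0 bits)
```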

Interpretation:

- H(X) = 0 bits: one outcome has probability 1, so there is no uncertainty.
- Higher entropy means more uncertainty: outcomes are harder to predict, and more bits are needed on average to encode them.
- For n outcomes, entropy is maximized at log₂(n) bits, reached when all outcomes are equally likely.

Examples:

- Fair coin (0.5, 0.5): H = 1 bit
- Biased coin (0.9, 0.1): H ≈ 0.469 bits
- Fair six-sided die (1/6 each): H = log₂ 6 ≈ 2.585 bits
- Certain outcome (probability 1): H = 0 bits

Applications: Data compression, cryptography, machine learning, natural language processing, physics, and more!
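
As a small taste of the compression and NLP applications, the sketch below (a hypothetical helper, empirical_entropy) estimates the entropy of the empirical character distribution of a string. By Shannon's source coding theorem, the value it prints is a lower bound on the average bits per character that any lossless character-level code could achieve for that distribution.

```python
from collections import Counter
import math

def empirical_entropy(text: str) -> float:
    """Entropy, in bits per character, of the empirical character distribution of text."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(empirical_entropy("abracadabra"))  # ~2.04 bits per character
```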