Calculate information entropy: H(X) = -Σ p(x) log₂ p(x)
Input Modes
Probability Distribution: enter probabilities separated by commas (e.g., 0.5, 0.25, 0.25); quick examples are provided.
Text Analysis: enter text to analyze character frequencies and calculate entropy.
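In Text Analysis mode, the entropy is computed from the empirical character frequencies of the input text. A minimal Python sketch of that kind of calculation (the function name and details are illustrative assumptions, not the tool's actual code):

```python
from collections import Counter
from math import log2

def text_entropy(text: str) -> float:
    """Shannon entropy, in bits per character, of a string's character distribution."""
    counts = Counter(text)
    total = len(text)
    # H(X) = -Σ p(x) log₂ p(x); symbols that never occur are simply absent from counts
    return -sum((c / total) * log2(c / total) for c in counts.values())

print(text_entropy("hello world"))  # ≈ 2.845 bits per character
```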
Results
The Results panel reports the Shannon entropy (in bits), the maximum possible entropy, the efficiency (entropy divided by maximum entropy), and the number of symbols, followed by detailed calculations using the formula H(X) = -Σ p(xᵢ) × log₂ p(xᵢ) and the probability distribution used.
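As a rough sketch of how these reported quantities relate, assuming the input is a list of probabilities summing to 1 (function and key names here are illustrative, not the calculator's internals):

```python
from math import log2

def entropy_report(probs):
    """Entropy, maximum entropy, efficiency, and symbol count for a distribution."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    h = -sum(p * log2(p) for p in probs if p > 0)  # Shannon entropy in bits
    h_max = log2(len(probs))                       # log2(n), reached by the uniform distribution
    return {
        "entropy_bits": h,
        "max_entropy_bits": h_max,
        "efficiency": h / h_max if h_max > 0 else 1.0,
        "num_symbols": len(probs),
    }

print(entropy_report([0.5, 0.25, 0.25]))
# {'entropy_bits': 1.5, 'max_entropy_bits': 1.584..., 'efficiency': 0.946..., 'num_symbols': 3}
```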
Understanding Shannon Entropy
What is Shannon Entropy? Shannon entropy measures the average amount of information (or uncertainty) in a random variable. It's measured in bits when using log₂.
Formula: H(X) = -Σ p(x) log₂ p(x)
Interpretation:
Higher entropy (closer to maximum): More uncertainty, more uniform distribution, more information per symbol
Lower entropy: Less uncertainty, more predictable, some symbols are much more likely than others
Zero entropy: Complete certainty, only one outcome is possible
Maximum entropy: Complete uncertainty, all outcomes equally likely (uniform distribution)
Examples:
Fair coin (0.5, 0.5): H = 1 bit (maximum for 2 outcomes)
Biased coin (0.9, 0.1): H ≈ 0.47 bits (more predictable)
Certain event (1.0): H = 0 bits (no uncertainty)
Fair 4-sided die: H = 2 bits (maximum for 4 outcomes)
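The example values above can be checked numerically; a small illustrative snippet (not part of the calculator):

```python
from math import log2

def H(probs):
    # H(X) = Σ -p(x) · log₂ p(x), skipping zero-probability outcomes
    return sum(-p * log2(p) for p in probs if p > 0)

print(H([0.5, 0.5]))   # 1.0   -> fair coin
print(H([0.9, 0.1]))   # 0.469 -> biased coin (≈ 0.47 bits)
print(H([1.0]))        # 0.0   -> certain event
print(H([0.25] * 4))   # 2.0   -> fair 4-sided die
```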
Applications: Data compression, cryptography, machine learning, natural language processing, physics, and more!