DistillPrep
Select Topic

- Descriptive Statistics (15)
- Probability Fundamentals (15)
- Probability Distributions (15)
- Central Limit Theorem (15)
- Hypothesis Testing (15)
- P-Values and Significance (15)
- Confidence Intervals (15)
- Bayesian Statistics (15)
- A/B Testing (15)
- Correlation and Regression Stats (15)
- Information Theory Basics (15)
- Time Series Statistics (15)
- Statistical Pitfalls (15)
Information Theory Basics (easy)
A fair six-sided die is rolled. What is the Shannon entropy of the outcome in bits?
A. 1 bit — there are two possible outcomes in a binary system
B. log₂(6) ≈ 2.58 bits — entropy of a uniform distribution over 6 outcomes
C. 6 bits — one bit per face of the die
D. 0 bits — each face is equally likely so there is no uncertainty
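The correct choice can be verified directly: for a uniform distribution over n outcomes, Shannon entropy is H = −Σ p·log₂(p) = log₂(n). A minimal sketch (the function name `shannon_entropy` is illustrative, not from the quiz):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair six-sided die: uniform distribution over 6 equally likely faces
die = [1 / 6] * 6
print(shannon_entropy(die))       # log2(6) ≈ 2.585 bits
print(math.log2(6))               # same value, by the uniform-case formula
```

Zero-probability outcomes are skipped because p·log₂(p) → 0 as p → 0, which is the standard convention for entropy.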