There are two fundamental summary quantities of a probability distribution: the expected value and the variance.
- The simplest and most useful summary of the distribution of a random variable is the “average” of the values it takes on, called the expected value.
- For a discrete r.v., it is the probability-weighted sum of the possible values: E(X) = Σ x · P(X = x), summing over all values x that X can take.
- The variance is a measure of how broadly distributed the r.v. tends to be.
- It’s defined as the expectation of the squared deviation from the mean:
- Var(X) = E[(X − E(X))²]
- In general terms, it is the expected squared distance of a value from the mean.
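The two definitions above can be computed directly from a probability mass function. A minimal sketch in plain Python, using a small hypothetical pmf (the values and probabilities are illustrative, not from the notes):

```python
# Hypothetical discrete distribution: value -> probability
pmf = {1: 0.2, 2: 0.5, 3: 0.3}

# E(X) = sum of x * P(X = x)
mean = sum(x * p for x, p in pmf.items())

# Var(X) = E[(X - E(X))^2], the expected squared deviation from the mean
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())

print(round(mean, 6), round(variance, 6))  # 2.1 0.49
```

The same two sums work for any finite discrete distribution; only the `pmf` dictionary changes.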
Comparing these two quantities across some common distributions is instructive:
- Bernoulli Distribution
- Uniform Distribution
- Geometric Distribution
- Binomial Distribution
- Normal Distribution
- Hypergeometric Distribution
- Poisson Distribution
- This link has nice examples of several distributions
- The two videos in this link discuss the mean and variance of a Bernoulli distribution
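For the Bernoulli case mentioned above, the general definitions reduce to the well-known closed forms E(X) = p and Var(X) = p(1 − p). A short sketch checking this numerically, with p = 0.3 as an arbitrary example value:

```python
# Bernoulli(p): X = 1 with probability p, X = 0 otherwise.
# p = 0.3 is a hypothetical example value.
p = 0.3
pmf = {0: 1 - p, 1: p}

mean = sum(x * q for x, q in pmf.items())                    # equals p
variance = sum((x - mean) ** 2 * q for x, q in pmf.items())  # equals p * (1 - p)

print(round(mean, 6), round(variance, 6))  # 0.3 0.21
```

Note that the variance p(1 − p) is largest at p = 0.5, where the outcome is most uncertain, and shrinks to 0 as p approaches 0 or 1.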