There are two fundamental summary quantities of a probability distribution: the expected value and the variance.
Expected value:
- The simplest and most useful summary of the distribution of a random variable is the “average” of the values it takes on.
- For a discrete random variable X, it is defined as E(X) = Σ_x x · P(X = x), the probability-weighted sum of the values X can take.
Variance:
- The variance is a measure of how broadly the r.v. tends to be spread around its mean.
- It’s defined as the expectation of the squared deviation from the mean:
- Var(X) = E[(X − E(X))²]
- In general terms, it is the expected squared distance of a value from the mean.
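The two definitions above can be applied directly to any discrete pmf. As a minimal sketch (the fair-die example is my own illustration, not from the notes), here is the mean and variance of a six-sided die computed exactly with `fractions`:

```python
from fractions import Fraction

# Distribution of a fair six-sided die: P(X = k) = 1/6 for k = 1..6.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

# E(X) = sum over x of x * P(X = x)
mean = sum(x * p for x, p in pmf.items())

# Var(X) = E[(X - E(X))^2], the expected squared deviation from the mean
var = sum((x - mean) ** 2 * p for x, p in pmf.items())

print(mean)  # 7/2
print(var)   # 35/12
```

Using exact fractions rather than floats makes it easy to check the result against the textbook values E(X) = 7/2 and Var(X) = 35/12.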
Computing these two quantities for different distributions gives an interesting perspective on them:
- Bernoulli Distribution
- Uniform Distribution
- Geometric Distribution
- Binomial Distribution
- Normal Distribution
- Hypergeometric Distribution
- Poisson Distribution
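For several of the distributions listed above, the mean and variance have simple closed forms (e.g. Bernoulli: p and p(1−p); Binomial: np and np(1−p); Poisson: λ and λ). As a sketch with illustrative parameter choices of my own, the snippet below checks those closed forms against a brute-force sum over each pmf:

```python
import math

def summarize(pmf):
    """Mean and variance computed directly from a pmf given as {value: prob}."""
    mean = sum(x * p for x, p in pmf.items())
    var = sum((x - mean) ** 2 * p for x, p in pmf.items())
    return mean, var

p, n, lam = 0.3, 10, 4.0

# Bernoulli(p): E = p, Var = p(1 - p)
bern = {0: 1 - p, 1: p}

# Binomial(n, p): E = np, Var = np(1 - p)
binom = {k: math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

# Poisson(lam), truncated well into the tail: E = lam, Var = lam
pois = {k: math.exp(-lam) * lam**k / math.factorial(k) for k in range(60)}

for name, pmf, (m_true, v_true) in [
    ("Bernoulli", bern, (p, p * (1 - p))),
    ("Binomial", binom, (n * p, n * p * (1 - p))),
    ("Poisson", pois, (lam, lam)),
]:
    m, v = summarize(pmf)
    print(f"{name}: mean={m:.6f} (formula {m_true}), var={v:.6f} (formula {v_true})")
```

The brute-force sums agree with the closed-form expressions up to the floating-point and truncation error of the Poisson tail.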
References:
- http://idiom.ucsd.edu/~rlevy/teaching/fall2008/lign251/lectures/lecture_3.pdf
- http://www.dma.unifi.it/~modica/2009-10/an1/appendici-Cormen.pdf
- http://terras-altas.net.br/MA-2013/statistics/probability%20distribution%20functions/Examples%20of%20Bernouilli%20distribution.pdf
- This link has nice examples of several distributions.
- http://people.umass.edu/biep540w/pdf/bernoulli.pdf
- http://www.math.uah.edu/stat/interval/Bernoulli.html
Video:
- https://www.khanacademy.org/math/statistics-probability/sampling-distributions-library/sample-proportions/v/mean-and-variance-of-bernoulli-distribution-example
- The two videos at this link discuss the mean and variance of a Bernoulli distribution.