Deviation: Markov & Chebyshev Bounds
The main focus of this section is bounding the probability that a random variable deviates by a given amount from its expected value. The mean, or the expectation, is not always enough on its own to predict a random variable's behavior.
When a random variable takes a value much larger than its mean, the probability of that event can be bounded with Markov's theorem. If $R$ is a nonnegative random variable, then for all $x > 0$:

$$\Pr[R \ge x] \le \frac{E[R]}{x}$$
The example given in the lectures is the disputable IQ score. The average IQ is said to be 100, and this means that only a third of the population, at most, can have an IQ of 300 or more. Because if it were more than a third, the average itself would have to exceed 100. So, the probability of having an IQ of at least 300 is less than or equal to the expectation divided by 300:

$$\Pr[\text{IQ} \ge 300] \le \frac{E[\text{IQ}]}{300}$$

The average (the expectation) is 100, so:

$$\Pr[\text{IQ} \ge 300] \le \frac{100}{300} = \frac{1}{3}$$
As the threshold grows as a multiple of the expectation, the bound shrinks proportionally: $\Pr[R \ge c \cdot E[R]] \le \frac{1}{c}$.
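As a quick sanity check, here is a small simulation (my own sketch, not from the notes) comparing the empirical tail probability of a simple nonnegative random variable against the Markov bound:

```python
import random

random.seed(0)

# Nonnegative random variable: the sum of two fair dice (mean 7).
samples = [random.randint(1, 6) + random.randint(1, 6) for _ in range(100_000)]
mean = sum(samples) / len(samples)

x = 11  # threshold
empirical = sum(s >= x for s in samples) / len(samples)
markov = mean / x  # Markov: Pr[R >= x] <= E[R] / x

print(f"empirical Pr[R >= {x}] = {empirical:.4f}")
print(f"Markov bound           = {markov:.4f}")
```

The bound holds but is loose here (roughly 0.64 against a true tail of about 0.08), which is typical: Markov uses only the mean, so it cannot be tight for every distribution.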
If we are given that the IQ score is always at least 50, we can get a better bound by adjusting the formula a little, namely, shifting the variable down by 50: $\text{IQ} - 50$ is still nonnegative, and its expectation is $100 - 50 = 50$. So, in that case, it would look like:

$$\Pr[\text{IQ} \ge 300] = \Pr[\text{IQ} - 50 \ge 250] \le \frac{50}{250} = \frac{1}{5}$$
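The improvement from shifting can be computed directly. A small sketch, using the IQ numbers from the example above:

```python
# Plain Markov bound on Pr[IQ >= 300], with E[IQ] = 100.
mean = 100
threshold = 300
plain = mean / threshold  # 1/3

# Shifted bound, using the extra fact that IQ >= 50 always:
# apply Markov to the nonnegative variable IQ - 50 instead.
low = 50
shifted = (mean - low) / (threshold - low)  # 50/250 = 1/5

print(plain, shifted)
```

Any known lower bound on the variable can be subtracted out this way; the larger the lower bound, the more the Markov bound improves.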
Another example is from Rice University's lecture notes. Assume the expected time it takes Algorithm A to traverse a graph with $n$ nodes is some value $\mu$. What is the probability that the algorithm takes more than 10 times that? Since we're looking for the outcome that is 10 times the expectation:

$$\Pr[T \ge 10\mu] \le \frac{\mu}{10\mu} = \frac{1}{10}$$

So, the probability is not more than $\frac{1}{10}$, whatever $\mu$ is.
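Note that the mean cancels out: for any threshold that is $c$ times the expectation, the bound is $1/c$. A one-line check (the mean values below are hypothetical, just to show the cancellation):

```python
def markov_multiple_bound(mean: float, c: float) -> float:
    """Markov bound on Pr[R >= c * E[R]] for nonnegative R."""
    return mean / (c * mean)  # algebraically this is just 1/c

# The bound is 1/10 regardless of the (hypothetical) mean value.
for mean in (12.5, 1000.0, 3.14):
    assert abs(markov_multiple_bound(mean, 10) - 0.1) < 1e-12
```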
Another example from the practice exercises: given a nonnegative random variable $R$ with $E[R] = \mu$, give the smallest value $x$ such that the Markov bound guarantees $\Pr[R \ge x] \le p$. Well, in this case, we have $\Pr[R \ge x] \le \frac{\mu}{x}$, and $\frac{\mu}{x} \le p$ holds exactly when $x \ge \frac{\mu}{p}$, so the smallest such value is $x = \frac{\mu}{p}$.
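The exercise's concrete numbers aren't reproduced here, so the values below are hypothetical; the formula itself is just the Markov bound solved for the threshold:

```python
def smallest_markov_threshold(mean: float, p: float) -> float:
    """Smallest x such that Markov guarantees Pr[R >= x] <= p
    for a nonnegative R with E[R] = mean: x = mean / p."""
    return mean / p

# e.g. (hypothetical numbers) with E[R] = 5, to guarantee
# a tail probability of at most 0.25:
x = smallest_markov_threshold(5, 0.25)
print(x)  # 20.0
```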
The Chebyshev bound says that the probability that the distance between $R$ and its mean is at least $x$ is bounded by the variance divided by $x^2$. The term variance stands out: it is the expectation of the square of the distance between $R$ and its mean. Let's call it $\operatorname{Var}[R]$:

$$\operatorname{Var}[R] = E\left[(R - E[R])^2\right]$$

So, the Chebyshev bound is:

$$\Pr\left[\,|R - E[R]| \ge x\,\right] \le \frac{\operatorname{Var}[R]}{x^2}$$

The square root of the variance is the standard deviation, denoted by $\sigma$:

$$\sigma = \sqrt{\operatorname{Var}[R]}$$
Now, let $x = c\sigma$ for some constant $c > 0$. Substituting into the bound gives:

$$\Pr\left[\,|R - E[R]| \ge c\sigma\,\right] \le \frac{\operatorname{Var}[R]}{c^2\sigma^2} = \frac{1}{c^2}$$

In words: the probability that $R$ lands $c$ or more standard deviations away from its mean is at most $\frac{1}{c^2}$.
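A quick empirical check of the Chebyshev bound (again my own sketch, not from the notes), using the sum of ten dice as the random variable:

```python
import random
import statistics

random.seed(1)

# Random variable: the sum of 10 fair dice.
samples = [sum(random.randint(1, 6) for _ in range(10)) for _ in range(100_000)]
mu = statistics.fmean(samples)
sigma = statistics.pstdev(samples)

c = 2
empirical = sum(abs(s - mu) >= c * sigma for s in samples) / len(samples)
chebyshev = 1 / c**2  # Chebyshev: Pr[|R - mu| >= c*sigma] <= 1/c^2

print(f"empirical = {empirical:.4f}, Chebyshev bound = {chebyshev:.4f}")
```

For this roughly bell-shaped distribution the true two-sigma tail is far below the bound of $0.25$; Chebyshev trades tightness for applying to every distribution with finite variance.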