Writing Probability as an Expectation (10/365)

Expectation E(\cdot) is such a ubiquitous construct in probability that it's worth looking at the various situations in which it appears. It's one of those constructs that is cheap to define yet pays off time and again.

Consider a random variable \theta and ask the following question.

\displaystyle  P(\theta \ge \epsilon) = ?

We can already write this as an expectation

\displaystyle  P(\theta \ge \epsilon) = EI(\theta \ge \epsilon)

where I(\theta \ge \epsilon) is the indicator function of the event \theta \ge \epsilon. If \epsilon > 0, we can make the following substitution

\displaystyle  EI(\theta \ge \epsilon) \le E\frac{\theta}{\epsilon} I(\theta \ge \epsilon)

After dividing by \epsilon, the random variable \theta/\epsilon takes values \ge 1 whenever the indicator is 1, which gives the inequality above. Further, if \theta is a non-negative random variable we can continue

\displaystyle  \begin{aligned} P(\theta \ge \epsilon) &= EI(\theta \ge \epsilon) \\ &\le E\frac{\theta}{\epsilon} I(\theta \ge \epsilon) && \text{if } \epsilon > 0 \\ &= \frac{1}{\epsilon} E\theta - \frac{1}{\epsilon} E\theta I(\theta < \epsilon) \\ &\le \frac{1}{\epsilon} E\theta && \text{if } \theta \ge 0 \end{aligned}

The third line uses the decomposition E\theta = E\theta I(\theta \ge \epsilon) + E\theta I(\theta < \epsilon), and the last line drops the subtracted term, which is non-negative when \theta \ge 0.

This gives us one version of Chebyshev's inequality

\displaystyle  P(\theta \ge \epsilon) \le \frac{E\theta}{\epsilon} \text{ when } \theta \ge 0 \text{ and } \epsilon > 0
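As a quick sanity check, let \theta be a fair six-sided die, so E\theta = 7/2, and take \epsilon = 5. Then

\displaystyle  P(\theta \ge 5) = \frac{2}{6} = \frac{1}{3} \le \frac{7/2}{5} = \frac{7}{10}

so the bound holds, though it is far from tight.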

### Example

As an example, I have computed the Chebyshev bounds on the probability that a die shows a value of at least x, for two different dice: a fair die and one weighted toward smaller values.
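The session below loads Stats.hs, which isn't reproduced here. As a guess at its contents, the following minimal definitions of e (expectation of values under a distribution) and cumul (running sums of the pmf) are consistent with the output shown; they are a sketch, not the author's actual file.

-- Hypothetical Stats.hs: minimal definitions consistent with the
-- GHCi session below, not the original source.
module Stats where

-- e dist xs: expectation of the values xs under the distribution dist,
-- i.e. the dot product sum_i p_i * x_i.
e :: Num a => [a] -> [a] -> a
e dist xs = sum (zipWith (*) dist xs)

-- cumul dist: cumulative distribution as running sums of the pmf,
-- starting from 0 (hence the seven entries printed for a six-sided die).
cumul :: Num a => [a] -> [a]
cumul = scanl (+) 0

With these definitions, map (1-) (cumul fair_die) lists the exact tail probabilities P(\theta \ge x) for x = 1, \dots, 7, while map (e fair_die var /) var lists the corresponding bounds E\theta / x.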

ghci> :l Stats.hs
ghci> let fair_die = replicate 6 (1/6 :: Rational)
ghci> let die2 = map (/21) [6,5..1 :: Rational]
ghci> let var = [1..6]
ghci> let error dist xs ys = e dist (zipWith (\x y -> (x-y)^2) xs ys)
ghci> 
ghci> print $ map (1-) (cumul fair_die)
  [1 % 1,5 % 6,2 % 3,1 % 2,1 % 3,1 % 6,0 % 1]

ghci> print $ map (e fair_die var /) var
  [7 % 2,7 % 4,7 % 6,7 % 8,7 % 10,7 % 12]

ghci> 
ghci> let error_fair = error fair_die (map (1-) (cumul fair_die)) $ map (e fair_die var /) var
ghci> let error_die2 = error die2 (map (1-) (cumul die2)) $ map (e die2 var /) var
ghci> error_fair > error_die2
  True

As you can see, the upper bound is pretty poor, though it gets closer to the true probability as we move further into the tail; the final comparison also shows that the weighted error is smaller for the die skewed toward small values. I'll come back to this and a couple of other versions of Chebyshev's inequality at a later time.
