Markov's and Chebyshev's inequalities

Let \(S = \{H, T\}^n\).

Let \(E_i = \{s \in S \mid s_i = H \text{ or } s_{i+1} = H\}\) for all \( 1 \leq i \leq n-1\).

Let \(R: S \to \mathbb{R}\) be the random variable \[R = \sum_{i=1}^{n-1}\mathbf{1}_{E_i},\] the sum of the indicator variables of the events \(E_i\).

(a) Briefly explain why \(\Pr [R = n-1] = F_{n+2}/2^n\). Use the fact that the size of \(C = \{s \in S \mid s_i = H \text{ or } s_{i+1} = H \text{ for each } i\}\) is \(F_{n+2}\).

If this looks familiar, it's because you've already done something similar in HW4.
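A sketch of the argument, assuming fair, independent flips: \(R = n-1\) exactly when every \(E_i\) occurs, i.e. when no two consecutive flips both land tails, which is precisely membership in \(C\). Since all \(2^n\) outcomes in \(S\) are equally likely,

\[\Pr[R = n-1] = \frac{|C|}{2^n} = \frac{F_{n+2}}{2^n}.\]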

(b) Give, with proof, the value of \(\mathbb{E} [R] \).

Recall \(S=\{H, T\}^n\) and \(R = \sum_{i=1}^{n-1}\mathbf{1}_{E_i}\).

\[\mathbb{E} [R] = \mathbb{E}\left[\sum_{i=1}^{n-1} \mathbf{1}_{E_i} \right] = \sum_{i=1}^{n-1} \mathbb{E} [\mathbf{1}_{E_i}]\]

by linearity of expectation.


For each \(i\),

\[\mathbb{E} [\mathbf{1}_{E_i}] = 1\cdot\Pr[\mathbf{1}_{E_i}=1] + 0 \cdot \Pr[\mathbf{1}_{E_i}=0] = \Pr[\mathbf{1}_{E_i}=1] = \Pr [E_i],\]

so \(\mathbb{E} [R] = \sum_{i=1}^{n-1} \Pr[E_i]\).

It remains to find \(\Pr[E_i]\).
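A sketch of this last step, assuming fair, independent flips: the four outcomes of \((s_i, s_{i+1})\) are equally likely, and all but \(TT\) lie in \(E_i\), so

\[\Pr[E_i] = 1 - \tfrac{1}{4} = \tfrac{3}{4} \quad\text{and}\quad \mathbb{E}[R] = \sum_{i=1}^{n-1}\tfrac{3}{4} = \frac{3(n-1)}{4}.\]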

(c) Obtain an upper bound on \(F_{n+2}\) using Markov's inequality. 

Recall \(S=\{H, T\}^n\).
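A sketch of how Markov's inequality yields the bound: \(R\) is nonnegative, and since \(R \leq n-1\) always, \(\{R \geq n-1\} = \{R = n-1\}\). Markov's inequality together with parts (a) and (b) then gives

\[\frac{F_{n+2}}{2^n} = \Pr[R = n-1] = \Pr[R \geq n-1] \leq \frac{\mathbb{E}[R]}{n-1} = \frac{3}{4},\]

so \(F_{n+2} \leq 3 \cdot 2^{n-2}\).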

(d) Give, with proof, the value of \(\text{Var}[R]\).

Recall \(S=\{H, T\}^n\).
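One way to carry out the computation, assuming fair, independent flips: the \(\mathbf{1}_{E_i}\) are not independent, so expand

\[\text{Var}[R] = \sum_{i=1}^{n-1}\text{Var}[\mathbf{1}_{E_i}] + 2\sum_{i<j}\text{Cov}(\mathbf{1}_{E_i}, \mathbf{1}_{E_j}).\]

Each term has \(\text{Var}[\mathbf{1}_{E_i}] = \tfrac{3}{4}\cdot\tfrac{1}{4} = \tfrac{3}{16}\). For \(|i-j| \geq 2\) the events involve disjoint flips, so the covariance vanishes; for \(j = i+1\), checking the eight equally likely outcomes of \((s_i, s_{i+1}, s_{i+2})\) gives \(\Pr[E_i \cap E_{i+1}] = \tfrac{5}{8}\), hence \(\text{Cov}(\mathbf{1}_{E_i}, \mathbf{1}_{E_{i+1}}) = \tfrac{5}{8} - \tfrac{9}{16} = \tfrac{1}{16}\). With \(n-1\) variance terms and \(n-2\) adjacent pairs,

\[\text{Var}[R] = (n-1)\cdot\tfrac{3}{16} + 2(n-2)\cdot\tfrac{1}{16} = \frac{5n-7}{16}.\]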
