evinda
Hello! (Wave)
Let $X_1, X_2, \dots, X_n$ be a random sample with $E(X_i)=\mu$ and $Var(X_i)=\sigma^2$ for all $i$. For $0<a<0.5$:
show that for any $k \in [0,1]$, the interval
$$\left( \overline{X}-z_{ka} \frac{\sigma}{\sqrt{n}}, \ \overline{X}+z_{(1-k)a} \frac{\sigma}{\sqrt{n}}\right)$$
is a 100(1-a)% confidence interval for the mean $\mu$.
How could we do this? (Thinking)
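Not a proof, but here is a minimal Monte Carlo sketch that checks the claim numerically: it estimates how often the interval $\left( \overline{X}-z_{ka} \frac{\sigma}{\sqrt{n}}, \ \overline{X}+z_{(1-k)a} \frac{\sigma}{\sqrt{n}}\right)$ covers $\mu$ for several values of $k$. The normal population, the particular values of $\mu$, $\sigma$, $n$, $a$ and $k$, and the convention that $z_p$ is the upper $p$-quantile ($P(Z>z_p)=p$) are all illustrative assumptions.

[CODE]
# Monte Carlo sanity check (not a proof) of the coverage of the interval
# (Xbar - z_{ka}*sigma/sqrt(n), Xbar + z_{(1-k)a}*sigma/sqrt(n)).
# mu, sigma, n, a and the k values below are illustrative assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu, sigma, n = 5.0, 2.0, 100
a = 0.05

def coverage(k, trials=20_000):
    # z_p is taken as the upper p-quantile: P(Z > z_p) = p.
    z_lo = norm.ppf(1 - k * a)          # z_{ka}
    z_hi = norm.ppf(1 - (1 - k) * a)    # z_{(1-k)a}
    samples = rng.normal(mu, sigma, size=(trials, n))
    xbar = samples.mean(axis=1)
    lower = xbar - z_lo * sigma / np.sqrt(n)
    upper = xbar + z_hi * sigma / np.sqrt(n)
    return np.mean((lower < mu) & (mu < upper))

for k in [0.0, 0.25, 0.5, 1.0]:
    print(f"k = {k}: empirical coverage = {coverage(k):.3f}")  # each close to 1 - a = 0.95
[/CODE]

For every $k$ the empirical coverage stays near $1-a$; the choice of $k$ only shifts how the tail probability $a$ is split between the two sides of the interval (with $k=0$ or $k=1$ giving the one-sided extremes).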