EnzoF61
If [tex]\bar{X}_1[/tex] and [tex]\bar{X}_2[/tex] are the means of independent random samples of sizes [tex]n_1[/tex] and [tex]n_2[/tex] from a normal population with mean [tex]\mu[/tex] and variance [tex]\sigma^2[/tex], show that the variance of the unbiased estimator [tex]\omega\bar{X}_1 + (1-\omega)\bar{X}_2[/tex] is a minimum when [tex]\omega = n_1/(n_1+n_2)[/tex].
My professor gave a hint: find [tex]\operatorname{Var}\bigl(\omega\bar{X}_1 + (1-\omega)\bar{X}_2\bigr)[/tex] and then minimize it with respect to [tex]\omega[/tex] using the first and second derivatives.
I understand that the coefficients get squared because of independence, but I'm not sure where to begin in finding the variances of the sample means. I also feel like I'm supposed to do the minimization with the Cramér-Rao inequality. Maybe I'm reading too much into what is being asked?
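A sketch of the calculation the hint points to (assuming the two samples are independent and every observation has variance [tex]\sigma^2[/tex]): since the mean of [tex]n_i[/tex] i.i.d. observations satisfies [tex]\operatorname{Var}(\bar{X}_i) = \sigma^2/n_i[/tex], independence gives

[tex]\operatorname{Var}\bigl(\omega\bar{X}_1 + (1-\omega)\bar{X}_2\bigr) = \omega^2\frac{\sigma^2}{n_1} + (1-\omega)^2\frac{\sigma^2}{n_2}.[/tex]

Setting the first derivative with respect to [tex]\omega[/tex] equal to zero,

[tex]\frac{2\omega\sigma^2}{n_1} - \frac{2(1-\omega)\sigma^2}{n_2} = 0 \;\Longrightarrow\; \frac{\omega}{n_1} = \frac{1-\omega}{n_2} \;\Longrightarrow\; \omega = \frac{n_1}{n_1+n_2},[/tex]

and the second derivative [tex]2\sigma^2/n_1 + 2\sigma^2/n_2 > 0[/tex] confirms this critical point is a minimum, so plain calculus suffices and the Cramér-Rao inequality isn't needed here.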