Answer:
a) [tex] E(\hat \theta_1) =\frac{1}{2} [E(X_1) +E(X_2)]= \frac{1}{2} [\mu + \mu] = \mu[/tex]
So then we conclude that [tex] \hat \theta_1[/tex] is an unbiased estimator of [tex]\mu[/tex]
[tex] E(\hat \theta_2) =\frac{1}{4} [E(X_1) +3E(X_2)]= \frac{1}{4} [\mu + 3\mu] = \mu[/tex]
So then we conclude that [tex] \hat \theta_2[/tex] is an unbiased estimator of [tex]\mu[/tex]
b) [tex] Var(\hat \theta_1) =\frac{1}{4} [\sigma^2 + \sigma^2 ] =\frac{\sigma^2}{2} [/tex]
[tex] Var(\hat \theta_2) =\frac{1}{16} [\sigma^2 + 9\sigma^2 ] =\frac{5\sigma^2}{8} [/tex]
Step-by-step explanation:
For this case we know that we have two independent random variables:
[tex] X_1 , X_2[/tex], both with mean [tex]\mu[/tex] and variance [tex] \sigma^2[/tex]
And we define the following estimators:
[tex] \hat \theta_1 = \frac{X_1 + X_2}{2}[/tex]
[tex] \hat \theta_2 = \frac{X_1 + 3X_2}{4}[/tex]
Part a
In order to see if both estimators are unbiased we need to check whether the expected value of each estimator equals the true value of the parameter:
[tex] E(\hat \theta_i) = \mu , i = 1,2 [/tex]
So let's find the expected values for each estimator:
[tex] E(\hat \theta_1) = E(\frac{X_1 +X_2}{2})[/tex]
Using properties of expected value we have this:
[tex] E(\hat \theta_1) =\frac{1}{2} [E(X_1) +E(X_2)]= \frac{1}{2} [\mu + \mu] = \mu[/tex]
So then we conclude that [tex] \hat \theta_1[/tex] is an unbiased estimator of [tex]\mu[/tex]
For the second estimator we have:
[tex]E(\hat \theta_2) = E(\frac{X_1 + 3X_2}{4})[/tex]
Using properties of expected value we have this:
[tex] E(\hat \theta_2) =\frac{1}{4} [E(X_1) +3E(X_2)]= \frac{1}{4} [\mu + 3\mu] = \mu[/tex]
So then we conclude that [tex] \hat \theta_2[/tex] is an unbiased estimator of [tex]\mu[/tex]
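As a quick sanity check (not part of the formal proof), we can approximate these expected values by simulation. The normal distribution and the values [tex]\mu = 5, \sigma = 2[/tex] below are assumptions chosen only for illustration; unbiasedness holds for any distribution with the stated mean and variance.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 5.0, 2.0, 1_000_000  # illustrative values (assumed, not given)

# Independent draws of X1 and X2, each with mean mu and variance sigma^2.
X1 = rng.normal(mu, sigma, n)
X2 = rng.normal(mu, sigma, n)

theta1 = (X1 + X2) / 2
theta2 = (X1 + 3 * X2) / 4

# Both sample means should be close to mu = 5, consistent with unbiasedness.
print(theta1.mean())  # ~5.0
print(theta2.mean())  # ~5.0
```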
Part b
For the variance we need to remember this property: if [tex]a[/tex] is a constant and [tex]X[/tex] is a random variable, then:
[tex] Var(aX) = a^2 Var(X)[/tex]
For the first estimator we have:
[tex] Var(\hat \theta_1) = Var(\frac{X_1 +X_2}{2})[/tex]
[tex] Var(\hat \theta_1) =\frac{1}{4} Var(X_1 +X_2)=\frac{1}{4} [Var(X_1) + Var(X_2) + 2 Cov (X_1 , X_2)] [/tex]
Since both random variables are independent we know that [tex] Cov(X_1, X_2 ) = 0[/tex], so we have:
[tex] Var(\hat \theta_1) =\frac{1}{4} [\sigma^2 + \sigma^2 ] =\frac{\sigma^2}{2} [/tex]
For the second estimator we have:
[tex] Var(\hat \theta_2) = Var(\frac{X_1 +3X_2}{4})[/tex]
[tex] Var(\hat \theta_2) =\frac{1}{16} Var(X_1 +3X_2)=\frac{1}{16} [Var(X_1) + 9Var(X_2) + 6 Cov (X_1 , X_2)] [/tex]
Since both random variables are independent we know that [tex] Cov(X_1, X_2 ) = 0[/tex], so we have:
[tex] Var(\hat \theta_2) =\frac{1}{16} [\sigma^2 + 9\sigma^2 ] =\frac{5\sigma^2}{8} [/tex]
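The variances can be checked by simulation as well. As in part a, the normal draws and the values [tex]\mu = 5, \sigma = 2[/tex] are assumptions for illustration only; with [tex]\sigma = 2[/tex] the sample variances should land near [tex]\sigma^2/2 = 2[/tex] and [tex]5\sigma^2/8 = 2.5[/tex].

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 5.0, 2.0, 1_000_000  # illustrative values (assumed, not given)

# Independent draws of X1 and X2, each with mean mu and variance sigma^2.
X1 = rng.normal(mu, sigma, n)
X2 = rng.normal(mu, sigma, n)

# Sample variances of the two estimators.
print(((X1 + X2) / 2).var())      # ~ sigma**2 / 2     = 2.0
print(((X1 + 3 * X2) / 4).var())  # ~ 5 * sigma**2 / 8 = 2.5
```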