(1) Only \( X_{i,1} \) and \( X_{i,2} \) share the same expected value \( \mu_i \), so only these two observations enter the likelihood function for \( \mu_i \):
$$ L(\mu_i) = \frac{1}{ \sqrt{2 \pi } \sigma } e^{ -\frac{1}{2} \left( \frac{x_{i,1} - \mu_i } { \sigma} \right)^2 } \frac{1}{ \sqrt{2 \pi } \sigma } e^{ -\frac{1}{2} \left( \frac{x_{i,2} - \mu_i } { \sigma} \right)^2 } $$ and hence
$$ l(\mu_i) = \ln(L(\mu_i)) = 2 \ln \left( \frac{1}{ \sqrt{2 \pi} \sigma} \right) - \frac{1}{2} \left( \frac{x_{i,1} - \mu_i } { \sigma} \right)^2 - \frac{1}{2} \left( \frac{x_{i,2} - \mu_i } { \sigma} \right)^2 $$ and therefore
$$ \frac{\partial}{\partial \mu_i} l(\mu_i) = \frac{x_{i,1} - \mu_i } { \sigma^2} + \frac{x_{i,2} - \mu_i } { \sigma^2} \overset{!}{=} 0 $$ Thus
$$ \hat \mu_i = \frac{1}{2} ( x_{i,1} + x_{i,2} )$$
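A minimal numerical sketch of this step (not part of the derivation): maximize \( l(\mu_i) \) for one pair and compare with the closed form \( (x_{i,1}+x_{i,2})/2 \). The data values and \( \sigma \) below are made-up example inputs.

```python
# Numerically maximize the per-pair log-likelihood l(mu_i) and compare
# with the closed-form MLE (x_i1 + x_i2) / 2.
from scipy.optimize import minimize_scalar

x_i1, x_i2, sigma = 4.2, 5.0, 1.3   # hypothetical data for a single pair

def neg_log_lik(mu):
    # negative of l(mu_i), dropping the additive constant 2*ln(1/(sqrt(2*pi)*sigma))
    return 0.5 * ((x_i1 - mu) / sigma) ** 2 + 0.5 * ((x_i2 - mu) / sigma) ** 2

res = minimize_scalar(neg_log_lik)
print(res.x)               # numerical maximizer of l(mu_i)
print((x_i1 + x_i2) / 2)   # closed-form MLE: 4.6
```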
(2) All \( X_{i,j} \) have the same variance \( \sigma^2 \), so every \( X_{i,j} \) contributes to the estimation of \( \sigma^2 \). The likelihood function is therefore
$$ L(\sigma) = \prod_{i=1}^n \frac{1}{ \sqrt{2 \pi } \sigma } e^{ -\frac{1}{2} \left( \frac{x_{i,1} - \hat \mu_i } { \sigma} \right)^2 } \prod_{i=1}^n \frac{1}{ \sqrt{2 \pi } \sigma } e^{ -\frac{1}{2} \left( \frac{x_{i,2} - \hat \mu_i } { \sigma} \right)^2 } = \\ \left(\frac{1}{ \sqrt{2 \pi } \sigma } \right)^{2n} e^{ -\frac{1}{2} \sum_{i=1}^n \left( \frac{x_{i,1} - \hat \mu_i } { \sigma} \right)^2 } e^{ -\frac{1}{2} \sum_{i=1}^n \left( \frac{x_{i,2} - \hat \mu_i } { \sigma} \right)^2 } $$ It follows that
$$ l(\sigma) = \ln(L(\sigma)) = -2n \ln(\sqrt{2\pi}) -2n \ln(\sigma) -\frac {1}{2} \sum_{i=1}^n \left( \frac{x_{i,1} - \hat \mu_i } { \sigma} \right)^2 -\frac {1}{2} \sum_{i=1}^n \left( \frac{x_{i,2} - \hat \mu_i } { \sigma} \right)^2 $$
and therefore
$$ \frac{\partial}{\partial \sigma} l(\sigma) = -\frac{2n}{\sigma} + \frac{1}{\sigma^3} \sum_{i=1}^n \left( x_{i,1} - \hat \mu_i \right)^2 + \frac{1}{\sigma^3} \sum_{i=1}^n \left( x_{i,2} - \hat \mu_i \right)^2 \overset{!}{=} 0 $$
Solving this for \( \sigma^2 \) yields $$ \hat \sigma^2 = \frac{\sum_{i=1}^n \left( x_{i,1} - \hat \mu_i \right)^2 + \sum_{i=1}^n \left( x_{i,2} - \hat \mu_i \right)^2}{2n} $$
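A minimal sketch of how this estimator can be computed in practice; the array shape, sample size, \( \sigma = 2 \), and the simulated means \( \mu_i \) are assumptions for illustration only.

```python
# Compute sigma_hat^2 from paired data x of shape (n, 2),
# using mu_hat_i = (x_i1 + x_i2) / 2 for each row.
import numpy as np

def sigma2_mle(x):
    """x: array of shape (n, 2) holding the pairs (x_i1, x_i2)."""
    mu_hat = x.mean(axis=1, keepdims=True)          # per-pair mean, shape (n, 1)
    return np.sum((x - mu_hat) ** 2) / (2 * x.shape[0])

rng = np.random.default_rng(0)
n, sigma = 1000, 2.0
mu = rng.normal(size=(n, 1))                        # arbitrary pair means mu_i
x = mu + sigma * rng.standard_normal((n, 2))        # X_ij ~ N(mu_i, sigma^2)
print(sigma2_mle(x))                                # noticeably below sigma^2 = 4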
(3) $$ (a) \quad \sum_{i=1}^n (x_{i,1} - \hat \mu_i)^2 = \sum_{i=1}^n \left[ (x_{i,1} - \mu_i) - (\hat \mu_i - \mu_i) \right]^2 = \\ \sum_{i=1}^n (x_{i,1} - \mu_i)^2 - 2 \sum_{i=1}^n (x_{i,1}-\mu_i)(\hat\mu_i- \mu_i) + \sum_{i=1}^n (\hat\mu_i - \mu_i)^2 $$ and
$$ (b) \quad \sum_{i=1}^n (x_{i,2} - \hat \mu_i)^2 = \sum_{i=1}^n \left[ (x_{i,2} - \mu_i) - (\hat \mu_i - \mu_i) \right]^2 = \\ \sum_{i=1}^n (x_{i,2} - \mu_i)^2 - 2 \sum_{i=1}^n (x_{i,2}-\mu_i)(\hat\mu_i- \mu_i) + \sum_{i=1}^n (\hat\mu_i - \mu_i)^2 $$
Taking expectations, and using \( \mathbb{E} \left[ (\hat\mu_i - \mu_i)^2 \right] = \operatorname{Var}(\hat\mu_i) = \sigma^2/2 \), turns (a) into $$ n \sigma^2 - 2 \sum_{i=1}^n \mathbb{E} \left[ (x_{i,1}-\mu_i)(\hat\mu_i- \mu_i) \right]+\frac{n}{2}\sigma^2 $$ and (b) into
$$ n \sigma^2 - 2 \sum_{i=1}^n \mathbb{E} \left [ (x_{i,2}-\mu_i)(\hat\mu_i- \mu_i) \right]+\frac{n}{2}\sigma^2$$
Adding the last two equations gives
$$ \mathbb{E} \left[ \sum_{i=1}^n (x_{i,1} - \hat \mu_i)^2 + \sum_{i=1}^n (x_{i,2} - \hat \mu_i)^2\right] = \\ 3n \sigma^2 - 2 \sum_{i=1}^n \mathbb{E} \left[ (x_{i,1}-\mu_i)(\hat\mu_i- \mu_i) \right] - 2 \sum_{i=1}^n \mathbb{E} \left[ (x_{i,2}-\mu_i)(\hat\mu_i- \mu_i) \right] = \\3n \sigma^2 - 4 \sum_{i=1}^n \mathbb{E} ( \hat \mu_i - \mu_i)^2 = 3n \sigma^2 - 4 \frac{n}{2} \sigma^2 = n \sigma^2 $$ where \( (x_{i,1}-\mu_i) + (x_{i,2}-\mu_i) = 2(\hat\mu_i - \mu_i) \) was used to combine the two cross-term sums, and thus finally
$$ \mathbb{E} (\hat \sigma^2) = \mathbb{E} \left[ \frac{ \sum_{i=1}^n (x_{i,1} - \hat \mu_i)^2 + \sum_{i=1}^n (x_{i,2} - \hat \mu_i)^2 } {2n} \right]= \frac{\sigma^2}{2}$$
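So the ML estimator systematically underestimates \( \sigma^2 \) by a factor of two. A minimal Monte Carlo sketch illustrating this; the replication count, sample size, seed, and \( \sigma = 2 \) are assumed values chosen only for the demonstration.

```python
# Average sigma_hat^2 over many simulated data sets and compare the
# result with sigma^2 / 2.
import numpy as np

rng = np.random.default_rng(1)
n, sigma, reps = 200, 2.0, 2000
estimates = np.empty(reps)
for r in range(reps):
    mu = rng.normal(size=(n, 1))                    # nuisance means mu_i
    x = mu + sigma * rng.standard_normal((n, 2))    # pairs X_i1, X_i2
    mu_hat = x.mean(axis=1, keepdims=True)          # per-pair ML means
    estimates[r] = np.sum((x - mu_hat) ** 2) / (2 * n)

print(estimates.mean())   # close to sigma^2 / 2 = 2.0, not sigma^2 = 4.0
```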