brendan_foo
Homework Statement
This was a midterm question that took some people very little time to solve, so I can only imagine either I was needlessly long-winded or something else occurred.
Hypothesis 1: [tex]S = 2 + \omega[/tex]
Hypothesis 2: [tex]S = -2 + \omega[/tex]
[tex]\omega \sim \mathcal{N}(0,\sigma^2_{w})[/tex]
Homework Equations
The following observations are made:
[tex]N_1 \sim \mathcal{N}(0,\sigma^2_1)[/tex]
[tex]N_2 \sim \mathcal{N}(0,\sigma^2_2)[/tex]
Formulate the likelihood ratio test.
The Attempt at a Solution
Clearly, the variable [itex]S[/itex] has its conditional PDF centered at the respective mean, and so I obtain the likelihood ratio as follows:
[tex]
\vec{r} \triangleq
\begin{pmatrix}
r_1\\
r_2
\end{pmatrix}
[/tex]
[tex]
\Lambda(\vec{r}) = \exp\left(-\frac{1}{2}\left((\vec{r} - \vec{\mu_{H1}})^{T} R^{-1}(\vec{r} - \vec{\mu_{H1}}) - (\vec{r} - \vec{\mu_{H0}})^{T} R^{-1}(\vec{r} - \vec{\mu_{H0}})\right)\right)
[/tex]
Where
[tex]
\vec{\mu_{H1}} =
\begin{pmatrix}
2\\
0
\end{pmatrix}
[/tex]
[tex]
\vec{\mu_{H0}} =
\begin{pmatrix}
-2\\
0
\end{pmatrix}
[/tex]
and
[tex]
R =
\begin{bmatrix}
\sigma_1^2 + \sigma_{w}^2 & \sigma_1^2\\
\sigma_1^2 & \sigma_{1}^2 + \sigma_2^2
\end{bmatrix}
[/tex]
Clearly the issue now becomes a matter of algebraic reduction after taking the natural logarithm. I have omitted any 'threshold' on the RHS (from the Bayes criterion, for example).
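As a numerical sanity check on the formulation above, the likelihood ratio can be computed two ways and compared: directly as a ratio of the two Gaussian densities, and via the quadratic-form expression. This is only a sketch; the variance values and the observation vector below are arbitrary placeholders, not part of the problem.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Placeholder variances (assumptions, not given in the problem).
s1, s2, sw = 1.0, 2.0, 0.5  # sigma_1^2, sigma_2^2, sigma_w^2

# Covariance matrix exactly as written in the attempt above.
R = np.array([[s1 + sw, s1],
              [s1,      s1 + s2]])

mu1 = np.array([2.0, 0.0])   # mean vector under H1
mu0 = np.array([-2.0, 0.0])  # mean vector under H0

r = np.array([0.7, -0.3])    # an arbitrary observation vector

# Likelihood ratio computed directly from the two Gaussian densities.
lam_pdf = (multivariate_normal.pdf(r, mu1, R)
           / multivariate_normal.pdf(r, mu0, R))

# Likelihood ratio from the quadratic-form expression for Lambda(r).
Rinv = np.linalg.inv(R)
def quad(m):
    return (r - m) @ Rinv @ (r - m)
lam_quad = np.exp(-0.5 * (quad(mu1) - quad(mu0)))

assert np.isclose(lam_pdf, lam_quad)  # the two agree
```

If the two numbers agree for several choices of [itex]r[/itex], the quadratic-form expression for [itex]\Lambda(\vec{r})[/itex] is at least internally consistent with the assumed Gaussian model.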
Is this formulation correct thus far? Some people mentioned there was no need for matrix inversions of any kind... Can this problem be simplified, or have I needlessly made things difficult for myself?
With thanks,
:)