Conditional probability for a random vector

In summary, the conditional probability P(X>1/2 | Y>1/3) is calculated using the formula P(X>1/2 ∩ Y>1/3)/P(Y>1/3). The numerator works out to 1/2. A first attempt gave 73/324 for the denominator, which cannot be right, since it would make the conditional probability exceed 1; integrating the joint density directly over the region y > 1/3 instead gives 14/27, and the final answer is 27/28 (an earlier attempt had slipped to 27/29). The denominator can be computed either by first finding the marginal distribution of Y or by directly integrating the joint density; the choice is a matter of personal preference and of keeping the calculation easy to check and document.
  • #1
TheSodesa

Homework Statement


The probability density function for a random vector ##(X,Y)## is ##f(x,y) = 3x##, when ##0 < y< x < 1##. Calculate the conditional probability
[tex]P(X> \frac{1}{2} | Y > \frac{1}{3})[/tex]

Homework Equations



Conditional probability:
\begin{equation}
P(A | B) = \frac{P(A \cap B)}{P(B)}
\end{equation}

Random vector density function:
\begin{equation}
P(A) = P((X,Y) \in A) = \int \int_{A} f(x,y) dx dy = \int \int_{A \cap \Omega} f(x,y) dx dy
\end{equation}

Marginal density function(s):
\begin{equation}
f_1(x) = \int_{-\infty}^{\infty} f(x,y) \, dy\\
\text{ and }\\
f_2(y) = \int_{-\infty}^{\infty} f(x,y) \, dx
\end{equation}

The Attempt at a Solution



Here ##P(X> \frac{1}{2} | Y>\frac{1}{3}) = \frac{P(X> \frac{1}{2} \cap Y>\frac{1}{3})}{P(Y>\frac{1}{3})}##.

The numerator is a part of a triangle below the line ##y = x## (area A in the picture below), whereas the denominator's domain is the entire triangle (A + B). The reason is that presumably ##f(x,y) = 0## above the line ##y = x##.
[Attached figure: area.png, showing the triangle ##0 < y < x < 1## split into regions A and B]

Now we calculate the numerator and denominator separately:

\begin{align*}
P(X> \frac{1}{2} \cap Y>\frac{1}{3})
&= \int_{1/2}^{1} \int_{1/3}^{x} 3x \, dy \, dx\\
&= \int_{1/2}^{1} \left[ 3xy \right]_{1/3}^{x} dx\\
&= \int_{1/2}^{1} \left( 3x^2 - x \right) dx\\
&= \left[ x^3 - \frac{1}{2}x^2\right]_{1/2}^{1}\\
&= 1/2
\end{align*}
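As a sanity check (my own addition, not part of the original post), the last two steps can be verified with exact rational arithmetic in Python:

```python
from fractions import Fraction

# Antiderivative of the integrand 3x^2 - x is x^3 - x^2/2.
def F(x):
    return x**3 - x**2 / 2

# Evaluate on [1/2, 1], as in the numerator calculation above.
numerator = F(Fraction(1)) - F(Fraction(1, 2))
print(numerator)  # 1/2
```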

For the denominator we need the density function of the marginal distribution of ##Y##, since the probability is only observed from its perspective:

\begin{align*}
f_2(y) = \int_{y}^{1} 3x dx = \left[ \frac{3}{2}x^2 \right]_{y}^{1} = \frac{3}{2} - \frac{3}{2}y^2
\end{align*}

Then
\begin{align*}
P(Y > \frac{1}{3})
&= \int_{1/3}^{1} \int_{1/3}^{x} \left( \frac{3}{2} - \frac{3}{2}y^2 \right) dy \, dx\\
&= \int_{1/3}^{1} \left[ \frac{3}{2}y - \frac{1}{2}y^3 \right]_{1/3}^{x} dx\\
&= \int_{1/3}^{1} \left( \frac{3}{2}x - \frac{1}{2}x^3 - \frac{1}{2} + \frac{1}{54} \right) dx\\
&= \left[ \frac{3}{4}x^2 - \frac{1}{8}x^4 - \frac{26}{54}x \right]_{1/3}^{1}\\
&= \frac{3}{4} - \frac{1}{8} - \frac{26}{54} - \frac{1}{12} + \frac{1}{216} + \frac{26}{162}\\
&= 73/324
\end{align*}
Now it's obvious that if I calculate the assigned conditional probability, it will be greater than ##1##. Where did I go wrong with this one?
 
  • #2
andrewkirk
TheSodesa said:
For the denominator we need the density function of the marginal distribution of ##Y##, since the probability is only observed from its perspective:
No, the calculation of the denominator ##P(B)## is exactly the same as that of the numerator, except that the lower outer integration limit is ##1/3## instead of ##1/2##.
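In other words (a sketch of my own, using exact fractions in Python; ##F## is just the antiderivative ##x^3 - \frac{1}{2}x^2## from the numerator calculation), the corrected computation is:

```python
from fractions import Fraction

# Inner integral of 3x over 1/3 < y < x gives 3x(x - 1/3) = 3x^2 - x,
# whose antiderivative is x^3 - x^2/2.
def F(x):
    return x**3 - x**2 / 2

numerator = F(Fraction(1)) - F(Fraction(1, 2))    # outer limits [1/2, 1]
denominator = F(Fraction(1)) - F(Fraction(1, 3))  # outer limits [1/3, 1]
print(numerator / denominator)  # 27/28
```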
 
  • #3
andrewkirk said:
No, the calculation of the denominator ##P(B)## is exactly the same as that of the numerator, except that the lower outer integration limit is ##1/3## instead of ##1/2##.

Yeah, that was it. The result (of the entire calculation, not just the denominator) was ##\frac{27}{28}##. The thing is, I tried doing that at first, but must have made a mistake doing it somewhere I couldn't spot, because the answer I got then was ##\frac{27}{29}##.

The bigger problem now is that the electronic hand-in system of my school, in giving out a hint on how to solve this, suggested that I calculate it the way I suggested in the OP. Would it be possible to do it using a marginal distribution function for the denominator? When does one actually need to calculate the marginal distribution? My course handout is skimping out on practical examples...
 
  • #4
TheSodesa said:
Yeah, that was it. The result (of the entire calculation, not just the denominator) was ##\frac{27}{28}##. The thing is, I tried doing that at first, but must have made a mistake doing it somewhere I couldn't spot, because the answer I got then was ##\frac{27}{29}##.

The bigger problem now is that the electronic hand-in system of my school, in giving out a hint on how to solve this, suggested that I calculate it the way I suggested in the OP. Would it be possible to do it using a marginal distribution function for the denominator? When does one actually need to calculate the marginal distribution? My course handout is skimping out on practical examples...

When you compute the marginal density ##f_X(x)## you are already integrating ##f_{X,Y}(x,y)## with respect to ##y##. Then when you find ##P(a < X < b)## by integrating ##f_X(x)## over ##a \leq x \leq b##, you really are doing a 2-dimensional integration of ##f_{X,Y}(x,y)## over some ##(x,y)##-region. In essence, you are doing the same work either way.

However, what matters is how to prevent errors and to make calculations clear and easily documented. I, personally, find it cleaner to first compute ##f_X## and then do an ##f_X##-integral, but others prefer the opposite method. Take your choice and use whatever works best for you.
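Both routes can be checked against each other with exact arithmetic (a sketch of my own in Python; the antiderivatives are the ones worked out by hand above):

```python
from fractions import Fraction

# Route 1: integrate the joint density directly over {1/3 < y < x < 1}.
# The inner y-integral of 3x gives 3x^2 - x; antiderivative x^3 - x^2/2.
def F(x):
    return x**3 - x**2 / 2

direct = F(Fraction(1)) - F(Fraction(1, 3))

# Route 2: first form the marginal f_Y(y) = 3/2 - (3/2)y^2, then do a
# single 1-D integral over [1/3, 1]; antiderivative (3/2)y - (1/2)y^3.
def G(y):
    return Fraction(3, 2) * y - y**3 / 2

marginal = G(Fraction(1)) - G(Fraction(1, 3))

print(direct, marginal)  # 14/27 14/27
```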
 

FAQ: Conditional probability for a random vector

1. What is conditional probability for a random vector?

For a random vector such as (X, Y), conditional probability measures the likelihood of an event involving one or both components, given that another event has already occurred. It is used to update the probability of a specific outcome once it is known that a certain condition is met.

2. How is conditional probability for a random vector calculated?

Conditional probability for a random vector is calculated by dividing the joint probability of the two events by the probability of the conditioning event. In mathematical terms, it can be written as P(A|B) = P(A and B) / P(B).
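A tiny illustration of this formula (my own hypothetical example, one roll of a fair die):

```python
from fractions import Fraction

outcomes = set(range(1, 7))               # one roll of a fair die
A = {n for n in outcomes if n % 2 == 0}   # event "even"
B = {n for n in outcomes if n > 3}        # event "greater than 3"

def P(event):
    return Fraction(len(event), len(outcomes))

print(P(A & B) / P(B))  # P(A|B) = 2/3
```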

3. What is the difference between conditional probability and unconditional probability?

The main difference between conditional probability and unconditional probability is that conditional probability takes into account a specific condition or event, while unconditional probability does not consider any conditions. Conditional probability also modifies the probability of an event based on new information, while unconditional probability remains constant regardless of new information.

4. How is conditional probability used in real life?

Conditional probability is used in various fields, including statistics, finance, and science. For example, it can be used to calculate the probability of a disease given a certain symptom, or the likelihood of a stock price increasing given a specific economic condition. It is also used in machine learning and data analysis to make predictions and decisions based on past data.

5. Can conditional probability be greater than 1?

No, conditional probability cannot be greater than 1. This is because it is a measure of the likelihood of an event occurring, and a probability cannot be greater than 1 (or 100%). If the calculated conditional probability is greater than 1, it is most likely due to an error in the calculations.
