Is Marginalization Always Valid for Joint Probabilities?

  • MHB
  • Thread starter: tmt1
  • Tags: Probability
In summary: for events $A$, $B$, $C$, it is always true that $P(B \land C) = P(B \land C \mid A)P(A) + P(B \land C \mid \lnot A)P(\lnot A)$, provided $0 < P(A) < 1$ so that both conditional probabilities are defined. This can be proven using the sum and product rules of probability.
  • #1
tmt1
Given events $A$, $B$, and $C$, will it always be true that $$P(B \land C) = P(B \land C \mid A) P(A) + P(B \land C \mid \lnot A) P( \lnot A)$$ regardless of the value of $P(A)$?

How can I prove this?
 
  • #2
tmt said:
Given events $A$, $B$, and $C$, will it always be true that $$P(B \land C) = P(B \land C \mid A) P(A) + P(B \land C \mid \lnot A) P( \lnot A)$$ regardless of the value of $P(A)$?

How can I prove this?

Hi tmt, (Smile)

Since $A \lor \lnot A$ is always true, we have:
$$P(B \land C) = P(B \land C \land (A \lor \lnot A))
= P((B \land C \land A) \lor (B \land C \land \lnot A))
$$
Since $A$ and $\lnot A$ are mutually exclusive, the events $B \land C \land A$ and $B \land C \land \lnot A$ are mutually exclusive as well, so the sum rule gives:
$$P((B \land C \land A) \lor (B \land C \land \lnot A)) = P(B \land C \land A) + P(B \land C \land \lnot A)
$$
Then, applying the general product rule $P(X \land Y) = P(X \mid Y)\,P(Y)$ to each term (which requires $P(A) > 0$ and $P(\lnot A) > 0$):
$$ P(B \land C \land A) + P(B \land C \land \lnot A) = P(B \land C \mid A)P(A) + P(B \land C \mid \lnot A)P(\lnot A)
$$
So indeed, as long as $0 < P(A) < 1$ (so that both conditional probabilities are defined), we can state without knowing anything more about $P(A)$ that:
$$P(B \land C) = P(B \land C \mid A)P(A) + P(B \land C \mid \lnot A)P(\lnot A)$$
This is exactly the law of total probability.
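The identity can also be checked numerically. Here is a small Python sketch (not from the thread) using a made-up joint distribution over three binary variables; the probabilities are arbitrary and only need to sum to 1.

```python
# Numeric sanity check of P(B∧C) = P(B∧C|A)P(A) + P(B∧C|¬A)P(¬A),
# using an arbitrary joint distribution over three binary variables.
import itertools

# Hypothetical joint probabilities p[(a, b, c)]; any values summing to 1 work.
vals = [0.10, 0.05, 0.15, 0.20, 0.08, 0.12, 0.18, 0.12]
p = dict(zip(itertools.product([0, 1], repeat=3), vals))

def prob(pred):
    """Total probability of all outcomes (a, b, c) satisfying pred."""
    return sum(v for k, v in p.items() if pred(*k))

p_bc = prob(lambda a, b, c: b == 1 and c == 1)   # P(B ∧ C)
p_a = prob(lambda a, b, c: a == 1)               # P(A)
p_bc_given_a = prob(lambda a, b, c: a == 1 and b == 1 and c == 1) / p_a
p_bc_given_not_a = prob(lambda a, b, c: a == 0 and b == 1 and c == 1) / (1 - p_a)

lhs = p_bc
rhs = p_bc_given_a * p_a + p_bc_given_not_a * (1 - p_a)
assert abs(lhs - rhs) < 1e-12   # the two sides agree
```

Changing `vals` to any other distribution that sums to 1 (with $0 < P(A) < 1$) leaves the assertion satisfied, which is the point of the derivation above.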
 

FAQ: Is Marginalization Always Valid for Joint Probabilities?

1. What is marginalizing a probability?

Marginalizing a probability is the process of summing over all possible values of one or more variables in a probability distribution in order to obtain the probability distribution of the remaining variables.
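On a discrete joint table, marginalization is just a sum over the unwanted variable. A minimal Python illustration (the table and its values are hypothetical):

```python
# Marginalizing a joint distribution P(weather, temperature).
# The values below are hypothetical and chosen only to sum to 1.
joint = {
    ("rain", "cold"): 0.30,
    ("rain", "warm"): 0.10,
    ("sun",  "cold"): 0.15,
    ("sun",  "warm"): 0.45,
}

def marginal(joint, axis):
    """Sum out every variable except the one at position `axis`."""
    out = {}
    for key, v in joint.items():
        out[key[axis]] = out.get(key[axis], 0.0) + v
    return out

p_weather = marginal(joint, 0)   # {"rain": 0.40, "sun": 0.60}
p_temp = marginal(joint, 1)      # {"cold": 0.45, "warm": 0.55}
```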

2. Why is it important to marginalize a probability?

Marginalizing a probability allows us to focus on the variables of interest and ignore any irrelevant variables. It also allows us to simplify complex probability distributions and make them more manageable to work with.

3. How is marginalizing a probability different from conditioning a probability?

Conditioning a probability involves fixing the value of one or more variables and calculating the probability of the remaining variables. Marginalizing a probability involves summing over all possible values of one or more variables to obtain the probability distribution of the remaining variables. In other words, conditioning focuses on a specific scenario, while marginalizing takes into account all possible scenarios.
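The difference is easy to see side by side on a small hypothetical table: marginalizing sums a variable out, while conditioning keeps only the rows matching a fixed value and renormalizes.

```python
# Marginalizing vs. conditioning on a hypothetical joint table
# P(weather, temperature); values sum to 1.
joint = {
    ("rain", "cold"): 0.30,
    ("rain", "warm"): 0.10,
    ("sun",  "cold"): 0.15,
    ("sun",  "warm"): 0.45,
}

# Marginalize out the temperature: P(weather) = sum over temperatures.
p_weather = {}
for (w, t), v in joint.items():
    p_weather[w] = p_weather.get(w, 0.0) + v
# → {"rain": 0.40, "sun": 0.60}

# Condition on temperature = "cold": keep matching rows, renormalize.
p_cold = sum(v for (w, t), v in joint.items() if t == "cold")   # 0.45
p_weather_given_cold = {
    w: v / p_cold for (w, t), v in joint.items() if t == "cold"
}
# → {"rain": 0.30/0.45 ≈ 0.667, "sun": 0.15/0.45 ≈ 0.333}
```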

4. Can marginalizing a probability change the values of the remaining variables?

No, marginalizing a probability does not change the values of the remaining variables. It only changes the way we look at the probability distribution by summing over all possible values of some variables.

5. How is marginalizing a probability used in real-world applications?

Marginalizing a probability is used in a variety of applications, including data analysis, machine learning, and statistical modeling. For example, in healthcare research, marginalizing probabilities can help determine the likelihood of a patient having a certain disease based on their symptoms and other risk factors.
