MHB Question about Bayesian Inference, Posterior Distribution

AI Thread Summary
To find the probability of the event $E$ given the posterior distributions of the probabilities $p_i$, one must recognize that each $p_i$ is a random variable. The formula $P(E|p_1,...,p_i,...,p_{|I|})=\prod_{i \in I} p_i^{k_i}(1-p_i)^{1-k_i}$ therefore gives only the conditional probability of $E$; the quantity of interest is the marginal probability $P(E)$, obtained by integrating this expression over the posterior distribution of the $p_i$.
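In symbols, the marginalization described above can be written out as follows. This is a sketch under the assumption (consistent with, but not stated explicitly in, the thread) that the $p_i$ are independent with Beta posteriors $p_i|\text{data} \sim \mathrm{Beta}(\alpha_i,\beta_i)$, where $\alpha_i,\beta_i$ are placeholder posterior parameters:

$$P(E)=\int P(E|p_1,...,p_{|I|})\,\pi(p_1,...,p_{|I|}|\text{data})\,dp_1\cdots dp_{|I|}=\prod_{i \in I} \mathbb{E}\left[p_i^{k_i}(1-p_i)^{1-k_i}\right]=\prod_{i \in I}\left(\frac{\alpha_i}{\alpha_i+\beta_i}\right)^{k_i}\left(\frac{\beta_i}{\alpha_i+\beta_i}\right)^{1-k_i}$$

Since each $k_i$ is either $0$ or $1$, every factor is just the posterior mean of $p_i$ (if $k_i=1$) or of $1-p_i$ (if $k_i=0$).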
thehairygorilla
I have a posterior distribution for $p_i$, which is based on a Beta prior and some data from a binomial distribution.

I also have the following expression:

$P(E)=\prod_{i \in I} p_i^{k_i}(1-p_i)^{1-k_i}$

which gives the probability of a specific pattern of successes and failures over the set $I$ in my model. Given the posterior distribution for each $p_i$, how do I find $P(E)$?
 
thehairygorilla said:
I have a posterior distribution for $p_i$, which is based on a Beta prior and some data from a binomial distribution.

I also have the following expression:

$P(E)=\prod_{i \in I} p_i^{k_i}(1-p_i)^{1-k_i}$

which gives the probability of a specific pattern of successes and failures over the set $I$ in my model. Given the posterior distribution for each $p_i$, how do I find $P(E)$?

Hi thehairygorilla, welcome to MHB!

The event $E$ consists of a combination of $k_i$ for $i\in I$.
To find the probability $P(E)$ we would fill in those $k_i$ and the given $p_i$ in the formula, wouldn't we?
 
I like Serena said:
Hi thehairygorilla, welcome to MHB!

The event $E$ consists of a combination of $k_i$ for $i\in I$.
To find the probability $P(E)$ we would fill in those $k_i$ and the given $p_i$ in the formula, wouldn't we?

So, not really. $p_i$ is a random variable. Better notation would be $P(E|p_1,...,p_i,...,p_{|I|})=\prod_{i \in I} p_i^{k_i}(1-p_i)^{1-k_i}$, and I am trying to find the marginal probability $P(E)$. Given the $p_i$, $P(E|p_1,...,p_i,...,p_{|I|})$ is expressed in terms of those random variables.
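A minimal computational sketch of this marginalization, assuming independent Beta posteriors $p_i \sim \mathrm{Beta}(\alpha_i,\beta_i)$; the parameter values and array names below are placeholders for illustration, not values from the thread:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior parameters for each p_i ~ Beta(alpha_i, beta_i);
# replace with the parameters of the actual posteriors.
alpha = np.array([3.0, 5.0, 2.0])
beta = np.array([4.0, 1.0, 6.0])
k = np.array([1, 0, 1])  # the pattern of successes/failures that defines E

# Monte Carlo: draw each p_i from its Beta posterior, evaluate P(E | p) per draw,
# and average to estimate the marginal P(E) = E[ P(E | p) ].
n_draws = 100_000
p = rng.beta(alpha, beta, size=(n_draws, len(alpha)))
prob_E_given_p = np.prod(np.where(k == 1, p, 1.0 - p), axis=1)
print("Monte Carlo estimate:", prob_E_given_p.mean())

# Closed form under independence: product of posterior means of p_i or (1 - p_i).
mean_p = alpha / (alpha + beta)
print("Closed form:", np.prod(np.where(k == 1, mean_p, 1.0 - mean_p)))
```

The two printed numbers should agree up to Monte Carlo error; the sampling version is the one that generalizes if the $p_i$ are not independent or the posteriors are not Beta.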
 