A simple Bayesian question on likelihood

In summary, the conversation discusses a question about calculating the likelihood function for independent and identically distributed variables with a random parameter. The likelihood function is given in two different forms, and help is needed for part (c). The conversation also touches on the idea of using calculus to find the maximum value of a function and discusses the terminology of "discrete" versus "piecewise".
  • #1
ghostyc
Hi all,

[Attached image: the problem statement; transcribed in post #2 below.]


In this question, I found that

[tex] \Pr(X_i|\theta)=\frac{\exp(\theta x_i)}{1+\exp(\theta)} [/tex]

and carrying on, the likelihood is [tex] \frac{\exp(\theta \sum_i x_i)}{(1+\exp(\theta))^n} [/tex]

and so [itex] S = \sum_i x_i [/itex] and [itex] T = S/n [/itex].
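
To spell out why [itex] T = S/n [/itex], rewrite the likelihood as an [itex]n[/itex]-th power:

[tex] \frac{e^{\theta S}}{(1+e^\theta)^n} = \left( \frac{e^{\theta S/n}}{1+e^\theta} \right)^n, [/tex]

so [itex] T = S/n = \bar{X} [/itex], the sample mean.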

I need some help with part (c).

========================================
[Attached image: a second, related problem; transcribed in post #3 below.]



How do I start with this question then?

I couldn't get a general expression for [tex] \Pr(X_i|\theta) [/tex].


How do I generally deal with these "piecewise" probability density functions?

Thanks!
 

  • #2
Suppose that [itex] X_1,...,X_n[/itex] are independent and identically distributed conditional on a random parameter [itex] \theta > 0 [/itex] such that
[tex] P(X_i = 1|\theta) = \frac{e^\theta}{1+e^\theta}[/tex] and [tex] P(X_i = 0| \theta) = 1 - P(X_i=1| \theta) [/tex].

A normal prior for [itex] \theta [/itex] is taken with mean [itex] \mu [/itex] and variance [itex] \sigma^2 [/itex].

(a) Show that the likelihood function is of the form

[tex] l(\theta) = \frac{e^{\theta S}}{(1 + e^\theta)^n} = \left( \frac{e^{\theta T}}{1 + e^\theta} \right)^n [/tex] and find [itex]S[/itex] and [itex]T[/itex] in terms of [itex] \{X_1,X_2,...,X_n\} [/itex].

(b) Hence, write down the posterior distribution for [itex] \theta [/itex].

(c) For any [itex] 0 < t < 1 [/itex] and [itex] \theta > 0[/itex], show that

[tex] \frac{e^{\theta t }}{1 + e^\theta} < t^t(1-t)^{1-t} [/tex] .
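
For part (b), a quick sketch (just Bayes' theorem with the stated normal prior; the normalising constant is left implicit): the posterior is proportional to likelihood times prior,

[tex] \pi(\theta \mid x_1, \dots, x_n) \propto \frac{e^{\theta S}}{(1+e^\theta)^n} \, \exp\!\left( -\frac{(\theta - \mu)^2}{2\sigma^2} \right), \qquad \theta > 0. [/tex]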


For part (c), one thought that comes to mind is to use calculus to check whether the function
[tex] f(p) = p^S (1 - p)^{n-S} [/tex] takes its maximum value when [itex] p = S/n [/itex]. If so, then [itex] (f(p))^{1/n} [/itex] also takes its maximum value there.
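
Following that thought, one way to carry it out (a sketch, via the substitution [itex] p = e^\theta/(1+e^\theta) [/itex], so that [itex] 1-p = 1/(1+e^\theta) [/itex]):

[tex] \frac{e^{\theta t}}{1 + e^\theta} = p^t (1-p)^{1-t}. [/tex]

Setting the derivative of [itex] \log g(p) = t\log p + (1-t)\log(1-p) [/itex] to zero gives [itex] t/p = (1-t)/(1-p) [/itex], so the maximiser over [itex] p \in (0,1) [/itex] is [itex] p = t [/itex], and therefore

[tex] \frac{e^{\theta t}}{1 + e^\theta} \leq t^t (1-t)^{1-t}, [/tex]

with equality only at [itex] p = t [/itex], i.e. when [itex] e^\theta = t/(1-t) [/itex].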
 
  • #3
---------------
Suppose that [itex] X_1,...,X_n[/itex] are independent and identically distributed conditional on a random parameter [itex] \theta > 0 [/itex] such that
for each [itex] i = 1,2,...,n [/itex],

[tex] P(X_i = -1|\theta) = \theta [/tex] and [tex] P(X_i = 1|\theta) = 1 - \theta [/tex].

(a) Show that the likelihood function is given by

[tex] l(\theta) = \theta^{n/2 - S} (1-\theta)^{n/2 + S} [/tex] and find [itex]S[/itex] in terms of [itex] \{X_1,X_2,...,X_n\} [/itex].
---------------

Let [itex] k [/itex] be the number of the [itex] X_i [/itex] that are +1. The likelihood is then [itex] \theta^{\,n-k}(1-\theta)^{k} [/itex], and matching this to the stated form requires [itex] n/2 + S = k [/itex], i.e. [itex] S = k - n/2 = \tfrac{1}{2}\sum_i X_i [/itex].
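
To address the "general expression" question from post #1: a compact, non-piecewise way to write the single-observation probability is

[tex] P(X_i = x \mid \theta) = \theta^{(1-x)/2} (1-\theta)^{(1+x)/2}, \qquad x \in \{-1, +1\}, [/tex]

since the exponent [itex] (1-x)/2 [/itex] equals 1 when [itex] x = -1 [/itex] and 0 when [itex] x = +1 [/itex]. Multiplying over [itex] i [/itex] then reproduces the stated form with [itex] S = \tfrac{1}{2}\sum_i X_i [/itex].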

Isn't "discrete" a better term than "piecewise"?
 

Related to A simple Bayesian question on likelihood

1. What is Bayesian probability and how is it different from traditional probability?

Bayesian probability treats probability as a degree of belief, updating the probability of an event as evidence accumulates. It differs from traditional (frequentist) probability by incorporating prior knowledge or beliefs into the calculation, whereas the frequentist approach relies solely on observed data.

2. How is likelihood used in Bayesian probability?

Likelihood is a key component of Bayesian probability. It represents the probability of observing a particular set of data given a specific hypothesis or belief. This likelihood is then combined with prior knowledge or beliefs to calculate the posterior probability, which represents the updated probability of the hypothesis being true.
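
In symbols: if [itex] \pi(\theta) [/itex] is the prior and [itex] l(\theta) [/itex] the likelihood of the observed data, the posterior satisfies

[tex] p(\theta \mid \text{data}) \propto l(\theta) \, \pi(\theta). [/tex]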

3. How do you determine the prior probability in Bayesian probability?

The prior probability is determined by using prior knowledge or beliefs about the event or hypothesis. This can be based on previous data, expert opinions, or subjective beliefs. The prior probability can also be updated as new data or evidence is collected.

4. What are the advantages of using Bayesian probability?

One advantage of using Bayesian probability is that it allows for the incorporation of prior knowledge or beliefs, which can lead to more accurate and nuanced predictions. Additionally, Bayesian probability can be updated as new data is collected, allowing for more flexibility in the analysis.

5. What are the limitations of using Bayesian probability?

One limitation of using Bayesian probability is that it relies on the accuracy of the prior knowledge or beliefs. If the prior is incorrect, the posterior probability may also be inaccurate. Additionally, Bayesian probability can be computationally intensive and may require a large amount of data to accurately estimate the posterior probability.
