Is X and Y's Dependence Evident from Their Joint Density Function g(x, y)?

  • #1
jam_33

Homework Statement



Let f(x) be a density on R+ (so f(x) = 0 if x < 0). Let g(x,y) = f(x+y)/(x+y) for x > 0, y > 0.
a) show g is a density on R^2
b) assume that the expectation mu and variance sigma^2 associated with the univariate density f exist and that mu^2 does not equal 2sigma^2. Show that X and Y are dependent.
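As a quick sanity check for part a), here is a numerical sketch with one concrete choice of f (the Exp(1) density — an assumption for illustration only, not part of the problem): the double integral of g over the first quadrant should come out close to 1.

```python
import math

# Assumed concrete density for illustration: f(s) = exp(-s) for s > 0 (Exp(1)).
def f(s):
    return math.exp(-s)

def g(x, y):
    return f(x + y) / (x + y)

# Midpoint-rule double integral of g over (0, 25)^2; for this f the tail
# beyond 25 is negligible. The midpoint rule also avoids the (integrable)
# singularity of g at the origin.
h = 0.025
n = int(25 / h)
total = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        total += g(x, y) * h * h

print(total)  # should be close to 1 (small error from the 1/(x+y) corner)
```

The substitution s = x + y turns the double integral into ∫ f(s) ds = 1 exactly, which is what the numerical value approximates.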

Homework Equations





The Attempt at a Solution



I have part a) done. As for part b) I am terribly confused. I first took the question as meaning E[X] = mu and E[Y] = mu. Then I assumed that X, Y were independent, aiming for a proof by contradiction. So E[XY] = mu^2 and Var(X+Y) = 2sigma^2 (since Cov(X,Y) = 0, and also assuming Var(X) = sigma^2 = Var(Y)). Then from the question we would have that E[XY] = mu^2 does not equal 2sigma^2 = Var(X+Y), from which I got nowhere.

So I thought I would have to go back and play with the joint density g(x,y) and use the definition of independence for a joint density, g(x,y) = g_X(x) g_Y(y), except I have no clue how to compute the integral for either marginal.

Any hints as to how to attack this problem would be greatly appreciated :)
 
  • #2

Thank you for your question. Let me try to provide some guidance on how to approach part b of this problem.

First, let's review what we know about the variables X and Y and their joint density g(x,y). We know that X and Y are both positive random variables (since x and y must be greater than 0 for the given function to be defined), and that g(x,y) is a density on R^2, meaning that it must satisfy the following conditions:

1. g(x,y) is non-negative for all x and y
2. The integral of g(x,y) over all values of x and y is equal to 1

Now, let's consider what it means for X and Y to be independent. This means that the joint density g(x,y) can be written as the product of the marginal densities g_X(x) and g_Y(y), as you mentioned. In other words:

g(x,y) = g_X(x) * g_Y(y)

A useful consequence is that expectations of products factor under independence:

E[XY] = E[X] * E[Y]

Now, let's consider the expected value of XY, given by E[XY]. This can be calculated as follows:

E[XY] = ∫∫ xy * g(x,y) dx dy

Using the property of independence that we just discussed, we can rewrite this as:

E[XY] = ∫∫ xy * g_X(x) * g_Y(y) dx dy

Because the integrand factors into a function of x times a function of y, the double integral separates:

E[XY] = ∫ x * g_X(x) dx * ∫ y * g_Y(y) dy

This is the factorization we will test. We are not entitled to assume it here; instead, we can compute E[XY] directly from g and compare. The key observation is that any expectation against g can be evaluated with the substitution s = x + y: for a suitable function phi,

∫∫ phi(x,y) * g(x,y) dx dy = ∫ (f(s)/s) * (∫_0^s phi(x, s-x) dx) ds

Taking phi = 1 recovers part a). Taking phi = x + y and phi = (x+y)^2 shows that S = X + Y has density f, so E[X+Y] = mu and Var(X+Y) = sigma^2; and by the symmetry of g in x and y, E[X] = E[Y] = mu/2.

Taking phi(x,y) = xy gives

E[XY] = ∫ (f(s)/s) * (∫_0^s x(s-x) dx) ds = ∫ (f(s)/s) * (s^3/6) ds = E[S^2]/6 = (mu^2 + sigma^2)/6

If X and Y were independent, we would need E[XY] = E[X] * E[Y] = mu^2/4. But (mu^2 + sigma^2)/6 = mu^2/4 rearranges to mu^2 = 2sigma^2, which the problem explicitly rules out. Therefore X and Y must be dependent.
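As a numerical cross-check of the dependence claim, here is a short Monte Carlo sketch, again assuming f = Exp(1) for illustration (so mu = 1, sigma^2 = 1, and mu^2 = 1 does not equal 2sigma^2 = 2). Under g, the conditional density of X given X+Y = s is the constant (f(s)/s)/f(s) = 1/s on (0, s), i.e. Uniform(0, s), so we can sample (X, Y) by drawing S from f and then X uniformly on (0, S). Substituting s = x + y as above predicts E[XY] = (mu^2 + sigma^2)/6 = 1/3, whereas independence would require E[X]E[Y] = 1/4.

```python
import random

random.seed(0)

# Assumed concrete f for illustration: Exp(1), so mu = 1, sigma^2 = 1.
n = 1_000_000
sum_x = sum_y = sum_xy = 0.0
for _ in range(n):
    s = random.expovariate(1.0)   # S = X + Y has density f
    x = random.uniform(0.0, s)    # X | S = s is Uniform(0, s) under g
    y = s - x
    sum_x += x
    sum_y += y
    sum_xy += x * y

ex, ey, exy = sum_x / n, sum_y / n, sum_xy / n
print(ex, ey, exy)
# Expect ex and ey near mu/2 = 0.5, exy near 1/3, while ex*ey is near 0.25,
# so E[XY] != E[X]E[Y]: X and Y are visibly dependent.
```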
 

