IID and Dependent RVs: A Closer Look at their Relationship and Parameters

  • #1
member 428835
If ##\epsilon_1,\epsilon_2## are iid ##N(0,1)##, ##X_1=\mu_1+\sigma_1 \epsilon_1## and ##X_2=\mu_2+\rho\epsilon_1+\sigma_2 \epsilon_2## are evidently a pair of dependent RVs that are not identically distributed for most values of the parameters. I have no idea what ##\mu,\sigma,\rho## are. I assume ##\mu## is mean and ##\sigma## is standard deviation? I read this example here.
 
  • #2
joshmccraney said:
If ##\epsilon_1,\epsilon_2## are iid ##N(0,1)##, ##X_1=\mu_1+\sigma_1 \epsilon_1## and ##X_2=\mu_2+\rho\epsilon_1+\sigma_2 \epsilon_2## are evidently a pair of dependent RVs that are not identically distributed for most values of the parameters. I have no idea what ##\mu,\sigma,\rho## are. I assume ##\mu## is mean and ##\sigma## is standard deviation? I read this example here.
The example in the link does not say exactly what they are, but we can make the reasonable assumption that he is using the very common notation in which the ##\mu##s, ##\rho##, and ##\sigma##s are real constants, with the ##\sigma##s positive, while ##\epsilon_1,\epsilon_2## are the iid ##N(0,1)## random variables. In that case, ##\mu_1## is the mean and ##\sigma_1## the standard deviation of ##X_1##, and ##\mu_2## is the mean of ##X_2##. The standard deviation of ##X_2## is more complicated. The two noise terms have standard deviations ##|\rho|## and ##\sigma_2##, and since ##\epsilon_1## and ##\epsilon_2## are independent, their variances add: ##\text{Var}(X_2)=\rho^2+\sigma_2^2##, so ##\text{SD}(X_2)=\sqrt{\rho^2+\sigma_2^2}##.
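A quick numerical sanity check of that variance formula (a sketch in Python; the parameter values below are illustrative choices, not from the linked example):

```python
# Sketch: verify Var(X2) = rho^2 + sigma2^2 by Monte Carlo.
# mu2, rho, sigma2 are illustrative values, not from the thread.
import numpy as np

rng = np.random.default_rng(0)
mu2, rho, sigma2 = 1.0, 0.5, 2.0

n = 1_000_000
eps1 = rng.standard_normal(n)   # iid N(0, 1)
eps2 = rng.standard_normal(n)   # iid N(0, 1), independent of eps1

x2 = mu2 + rho * eps1 + sigma2 * eps2

sd_theory = np.sqrt(rho**2 + sigma2**2)   # sqrt(0.25 + 4) ~ 2.0616
print(x2.mean(), x2.std(), sd_theory)     # sample mean/SD track mu2 and sd_theory
```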
 
  • #3
FactChecker said:
The example in the link does not say exactly what they are, but we can make the reasonable assumption that he is using the very common notation in which the ##\mu##s, ##\rho##, and ##\sigma##s are real constants, with the ##\sigma##s positive, while ##\epsilon_1,\epsilon_2## are the iid ##N(0,1)## random variables. In that case, ##\mu_1## is the mean and ##\sigma_1## the standard deviation of ##X_1##, and ##\mu_2## is the mean of ##X_2##. The standard deviation of ##X_2## is more complicated. The two noise terms have standard deviations ##|\rho|## and ##\sigma_2##, and since ##\epsilon_1## and ##\epsilon_2## are independent, their variances add: ##\text{Var}(X_2)=\rho^2+\sigma_2^2##, so ##\text{SD}(X_2)=\sqrt{\rho^2+\sigma_2^2}##.
Okay, thanks. So I'm missing the crux: why are these dependent instead of independent? It seems to be because ##\epsilon_1## is a function of ##X_1##, and so ##X_2## implicitly depends on ##X_1##?
 
  • #4
No. ##\epsilon_1## is not a function of ##X_1##. It is the other way around.
It is not that ##X_2## depends on ##X_1##. It is better to understand that they both depend on ##\epsilon_1##, so their tendencies are related. That makes them correlated and not independent.

(PS. I don't like to say that ##X_1## and ##X_2## are dependent until you are comfortable with what that means in probability. It just means that the tendencies of one give a hint to the tendencies of the other. It does not mean the functional dependency that you are probably used to.)
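To see that shared-noise dependence concretely, here is a short simulation (a sketch; the parameter values are illustrative). Since ##\text{Cov}(X_1,X_2)=\sigma_1\rho##, the correlation works out to ##\rho/\sqrt{\rho^2+\sigma_2^2}##:

```python
# Sketch: X1 and X2 share eps1, so they are correlated (hence not independent).
# Cov(X1, X2) = sigma1 * rho, so Corr(X1, X2) = rho / sqrt(rho^2 + sigma2^2).
# Parameter values are illustrative, not from the thread.
import numpy as np

rng = np.random.default_rng(1)
mu1, sigma1 = 0.0, 1.5
mu2, rho, sigma2 = 1.0, 0.8, 1.0

n = 1_000_000
eps1 = rng.standard_normal(n)
eps2 = rng.standard_normal(n)

x1 = mu1 + sigma1 * eps1
x2 = mu2 + rho * eps1 + sigma2 * eps2

corr_theory = rho / np.hypot(rho, sigma2)   # ~ 0.6247
corr_sample = np.corrcoef(x1, x2)[0, 1]     # sample estimate, close to theory
print(corr_theory, corr_sample)
```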
 
  • #5
FactChecker said:
No. ##\epsilon_1## is not a function of ##X_1##. It is the other way around.
It is not that ##X_2## depends on ##X_1##. It is better to understand that they both depend on ##\epsilon_1##, so their tendencies are related. That makes them correlated and not independent.

(PS. I don't like to say that ##X_1## and ##X_2## are dependent until you are comfortable with what that means in probability. It just means that the tendencies of one give a hint to the tendencies of the other. It does not mean the functional dependency that you are probably used to.)
Perfect explanation, thanks so much!
 
  • #6
Consider human arm length and leg length. Clearly, they are related and not independent, yet there are many other factors involved and one does not cause the other. It is just that their tendencies are related.
 
  • #7
For one concrete example, a similar form appears in moving-average (MA) models in time-series analysis, where ##X_t = \mu + \epsilon_t + \theta_1\epsilon_{t-1} + \dots + \theta_q\epsilon_{t-q}##. Here the current value of the series depends on the current and previous noise terms, so nearby values share noise terms. In the same way, the two random variables ##X_1## and ##X_2## above share the noise term ##\epsilon_1##, making them dependent.

https://en.wikipedia.org/wiki/Moving-average_model
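A minimal MA(1) simulation (a sketch; ##\theta##, ##\mu##, and the seed are arbitrary choices) shows this shared-noise dependence as an autocorrelation between consecutive values:

```python
# Sketch of an MA(1) process X_t = mu + eps_t + theta * eps_{t-1}.
# Consecutive values share one noise term, just like X1 and X2 in the thread.
# Lag-1 autocorrelation is theta / (1 + theta^2); theta and mu are illustrative.
import numpy as np

rng = np.random.default_rng(2)
mu, theta, n = 0.0, 0.6, 1_000_000

eps = rng.standard_normal(n + 1)
x = mu + eps[1:] + theta * eps[:-1]

acf1_theory = theta / (1 + theta**2)             # ~ 0.4412
acf1_sample = np.corrcoef(x[:-1], x[1:])[0, 1]   # correlation of (X_t, X_{t+1})
print(acf1_theory, acf1_sample)
```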
 

FAQ: IID and Dependent RVs: A Closer Look at their Relationship and Parameters

What is IID and how is it related to dependent random variables?

IID stands for independent and identically distributed: the random variables in the set are mutually independent and all share the same probability distribution. Dependent random variables, by contrast, are not independent (knowing the value of one carries information about the others), and they need not share the same distribution. Note that independence and identical distribution are separate properties: variables can be identically distributed yet dependent, or independent yet differently distributed.
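A tiny illustration that the two properties are separate (a sketch; the variables are made up for this example):

```python
# Sketch: "identically distributed" and "independent" are different properties.
# z1 and z2 are iid N(0,1); w = z1 has the same N(0,1) marginal as z1
# but is completely dependent on it.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)   # independent copy: (z1, z2) is iid
w = z1                         # identically distributed with z1, but dependent

print(np.corrcoef(z1, z2)[0, 1])   # near 0: no linear dependence
print(np.corrcoef(z1, w)[0, 1])    # exactly 1: perfect dependence
```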

How can we determine if a set of random variables are IID or dependent?

Strictly speaking, the two properties are checked separately. Goodness-of-fit tests such as the chi-square test or the Kolmogorov-Smirnov test compare samples against a hypothesized distribution (or against each other), which addresses the "identically distributed" part; independence can be probed with, for example, a chi-square test of independence on a contingency table or by examining sample correlations. In each case, a p-value below the significance level lets us reject the null hypothesis. A large p-value does not prove the null; it only means the data provide no evidence against it.
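A small sketch of the "identically distributed" check using a two-sample Kolmogorov-Smirnov test (the sample sizes, shift, and seed are illustrative):

```python
# Sketch: a two-sample KS test checks whether two samples come from the
# same distribution; it says nothing about independence. Illustrative data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
a = rng.standard_normal(5000)
b = rng.standard_normal(5000)          # drawn from the same distribution as a
c = rng.normal(loc=0.5, size=5000)     # shifted distribution

p_same = stats.ks_2samp(a, b).pvalue   # large: no evidence the dists differ
p_diff = stats.ks_2samp(a, c).pvalue   # tiny: strong evidence they differ
print(p_same, p_diff)
```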

What are the parameters that describe IID and dependent random variables?

For IID random variables, the marginal parameters, such as the mean and variance, are the same for every variable in the set; that is what "identically distributed" requires of them. For dependent random variables, the joint behavior is described by additional parameters such as the covariance, the correlation coefficient, and, for time series, the autocorrelation. These parameters quantify the relationship between the variables and help us understand their dependence.
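These parameters can be estimated directly from data; a short sketch (the construction and values below are illustrative):

```python
# Sketch: sample estimates of mean, variance, covariance, and correlation
# for two dependent variables built from a shared noise term. Illustrative values.
import numpy as np

rng = np.random.default_rng(5)
n = 200_000
e = rng.standard_normal((2, n))
x = 1.0 + 2.0 * e[0]        # mean 1, variance 4
y = -1.0 + e[0] + e[1]      # shares e[0] with x, so Cov(x, y) = 2

cov = np.cov(x, y)[0, 1]            # ~ 2
corr = np.corrcoef(x, y)[0, 1]      # ~ 2 / (2 * sqrt(2)) ~ 0.707
print(x.mean(), x.var(), cov, corr)
```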

Can IID and dependent random variables be used interchangeably in statistical analysis?

No, IID and dependent random variables cannot be used interchangeably in statistical analysis. IID random variables have specific properties that make them suitable for certain types of analysis, such as hypothesis testing and regression analysis. Dependent random variables have different properties and require different methods of analysis. Using the wrong type of random variables can lead to incorrect conclusions and unreliable results.

How can understanding the relationship between IID and dependent random variables be useful in real-world applications?

Understanding the relationship between IID and dependent random variables can be useful in many real-world applications, such as finance, economics, and engineering. For example, in finance, understanding the dependence between stock prices can help investors make more informed decisions. In economics, understanding the dependence between different economic indicators can help policymakers make better predictions. In engineering, understanding the dependence between variables can help improve the design and performance of systems and processes.
