Bivariate correlation does not always catch multicollinearity

In summary, bivariate correlation measures the relationship between two variables but may fail to detect multicollinearity, which occurs when three or more variables are interrelated. This limitation can lead to misleading conclusions in regression analysis, because a strong joint linear dependence among several predictors may go unnoticed when only pairwise relationships are examined. Understanding the distinction is crucial for accurate statistical modeling and interpretation.
  • #1
fog37
TL;DR Summary
Bivariate correlation does not always catch multicollinearity
Hello,

While studying multicollinearity, I learned that if there are more than 2 predictors ##X##, for example 3 predictors ##X_1, X_2, X_3##, it is possible for all the pairwise correlations to be low in value but for multicollinearity to still be an issue... Would that mean that some "triple" correlation, i.e. the average of the products ##(X_1 X_2 X_3)##, has a high value (higher than 0.7)? Is that correct?

Would you have a simple example of how three variables may be correlated collectively even if their pairwise correlations are low?

Thank you!
 
  • #2
In a visual sense, using Venn diagrams, how can the predictors be correlated all together if they are not pairwise correlated at all? The figures below show moderate multicollinearity and strong multicollinearity. I don't see how the ##X## circles could fail to overlap pairwise and still cause multicollinearity...

[Figure: Venn diagrams of ##Y## and the predictors, comparing moderate and strong multicollinearity]
 
  • #3
It may depend on how low you demand the individual pairwise correlations to be. Suppose that ##X_1## and ##X_2## are independent, identically distributed random variables and that ##Y = X_1+X_2##. Then I think it is clear that the correlation of ##Y## with any one ##X_i## may be smaller than the threshold even though ##Y## is a deterministic function of ##X_1, X_2##.
In fact, it gets easier when ##Y## is a function of more independent ##X_i## variables. Any one ##X_i## might have a low correlation with ##Y## but the combination of all the ##X_i##s might completely determine ##Y##. Suppose ##Y = X_1+X_2+...+X_{100}##, where the ##X_i##s are pairwise independent.
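For concreteness, here is a minimal numpy sketch of the ##Y = X_1+X_2+...+X_{100}## case (the sample size and seed are arbitrary choices): each pairwise correlation ##\text{corr}(Y, X_i)## comes out near ##1/\sqrt{100} = 0.1##, yet the ##X_i## together reproduce ##Y## exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, k = 100_000, 100

# 100 independent standard-normal predictors; Y is their exact sum
X = rng.standard_normal((n_obs, k))
Y = X.sum(axis=1)

# Each pairwise correlation corr(Y, X_i) is only about 1/sqrt(k) = 0.1 ...
pairwise = [np.corrcoef(Y, X[:, i])[0, 1] for i in range(k)]
print(f"max |corr(Y, X_i)| = {max(abs(r) for r in pairwise):.3f}")

# ... yet regressing Y on all the X_i jointly gives R^2 = 1,
# because Y is a deterministic linear function of them.
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
r2 = 1 - (Y - X @ beta).var() / Y.var()
print(f"R^2 of Y on all X_i together: {r2:.6f}")
```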
 
  • #4
FactChecker said:
It may depend on how low you demand the individual pairwise correlations to be. Suppose that ##X_1## and ##X_2## are independent, identically distributed random variables and that ##Y = X_1+X_2##. Then I think it is clear that the correlation of ##Y## with any one ##X_i## may be smaller than the threshold even though ##Y## is a deterministic function of ##X_1, X_2##.
In fact, it gets easier when ##Y## is a function of more independent ##X_i## variables. Any one ##X_i## might have a low correlation with ##Y## but the combination of all the ##X_i##s might completely determine ##Y##. Suppose ##Y = X_1+X_2+...+X_{100}##, where the ##X_i##s are pairwise independent.
Thank you. My understanding is that multicollinearity is when the predictors are correlated in such a way that the estimated coefficient for a predictor, which should indicate the change in ##Y## per unit change in that ##X##, is not what it really is: because ##X_1## and ##X_2## are correlated, when ##X_1## changes by one unit we cannot hold ##X_2## fixed, and it changes too (the sketch at the end of this post illustrates this)...

Let's say ##Y=b_1 X_1 + b_2 X_2 + b_3 X_3##, and the predictors ##X_i## are nearly pairwise uncorrelated, with low correlation coefficients ##r_{12} = r_{13} = r_{23} \approx 0.2##. That alone is not proof that multicollinearity is absent...

It could still be that some collective correlation, say ##r_{123} \approx 0.8##, is high... But how could that be? How can the variables collectively be more correlated than they are pairwise? I am struggling to see it, especially visually using the Venn diagram, where each smaller circle represents the variance of one ##X## and the larger circle the variance of ##Y##...
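To make the coefficient problem above concrete, here is a minimal numpy sketch (the correlation of about 0.9 and the true coefficients ##b_1 = b_2 = 1## are arbitrary choices, exaggerated to show the effect):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Two predictors with correlation of about 0.9
x1 = rng.standard_normal(n)
x2 = 0.9 * x1 + np.sqrt(1 - 0.9**2) * rng.standard_normal(n)
y = 1.0 * x1 + 1.0 * x2 + rng.standard_normal(n)  # true b1 = b2 = 1

# With both predictors in the model, the coefficients are recovered
X_full = np.column_stack([x1, x2])
b_full, *_ = np.linalg.lstsq(X_full, y, rcond=None)
print("b1, b2 with both predictors:", b_full.round(2))   # ~ [1.0, 1.0]

# Dropping x2, the coefficient on x1 absorbs x2's effect (~ 1 + 0.9),
# because changing x1 by one unit drags x2 along with it.
b_alone, *_ = np.linalg.lstsq(x1[:, None], y, rcond=None)
print("b1 with x1 alone:", b_alone.round(2))              # ~ [1.9]
```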
 
  • #5
Oh, maybe I get it now... It could be that ##Y=\beta_1 X_1 +\beta_2 X_2 + \beta_3 X_3## and the three regressors are pairwise (almost) uncorrelated with each other, BUT the correlation between ##X_1## and, for example, the sum ##X_2+X_3## is nonzero and high in value. The same goes for the correlation between ##X_2## and ##X_1+X_3##, etc.

I think that is what the variance inflation factor (VIF) does: instead of focusing on the pairwise correlations, it regresses each predictor on all the remaining predictors, capturing these combined relationships that cannot be visualized with the Venn diagrams of the individual predictors and the response variable...
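To check this on numbers, here is a minimal numpy sketch of the VIF computation as I understand it: regress each predictor on all the others and set ##VIF_j = 1/(1 - R_j^2)##. Building the tenth predictor as a noisy average of the other nine is just my own convenient way to get low pairwise correlations together with a high VIF:

```python
import numpy as np

def vif(X):
    """VIF_j = 1/(1 - R_j^2), where R_j^2 comes from regressing
    column j of X on all remaining columns (plus an intercept)."""
    n, k = X.shape
    out = []
    for j in range(k):
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
        r2 = 1 - (X[:, j] - others @ beta).var() / X[:, j].var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(2)
n = 100_000
Z = rng.standard_normal((n, 9))                       # 9 independent predictors
x10 = Z.mean(axis=1) + 0.05 * rng.standard_normal(n)  # 10th ~ their average
X = np.column_stack([Z, x10])

# Every pairwise correlation involving x10 is only about 0.33 ...
print("corr(x1, x10) =", np.corrcoef(X[:, 0], x10)[0, 1].round(2))
# ... yet x10 is almost a linear combination of the others: VIF far above 10
print("VIF of x10 =", vif(X)[-1].round(1))
```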
 
  • #6
fog37 said:
Oh, maybe I get it now... It could be that ##Y=\beta_1 X_1 +\beta_2 X_2 + \beta_3 X_3## and the three regressors are pairwise (almost) uncorrelated with each other, BUT the correlation between ##X_1## and, for example, the sum ##X_2+X_3## is nonzero and high in value.
Not if the ##X_i##s are independent. Then ##X_1## would be uncorrelated with ##X_2+X_3##.

I probably should leave this for others since I am not an expert. But if ##Y = X_1+X_2##, where the ##X##s are independent, then each ##X_i## predicts ##Y## only to a limited extent: ##X_1## and ##X_2## are independent of each other, and ##Y## is only somewhat correlated with each individual ##X_i##, yet completely determined by the pair. The more ##X_i##s there are in the sum, the weaker the correlation between ##Y## and each individual ##X_i##.
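To put numbers on that last claim: if ##Y = X_1 + \dots + X_n## with i.i.d. unit-variance ##X_i##, then ##\operatorname{Var}(Y) = n## and ##\operatorname{Cov}(Y, X_i) = 1##, so ##\operatorname{corr}(Y, X_i) = 1/\sqrt{n}##: about ##0.71## for ##n = 2## but only ##0.1## for ##n = 100##. And for a minimal three-variable version of the original question: any three standardized variables satisfying ##X_1 + X_2 + X_3 = 0## (for instance ##X_i = U_i - \bar U## for i.i.d. ##U_i##, rescaled to unit variance) have every pairwise correlation equal to exactly ##-0.5##, which a ##|r| > 0.7## screen would never flag, yet they are perfectly collinear: each one is an exact linear function of the other two.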
 

FAQ: Bivariate correlation does not always catch multicollinearity

What is multicollinearity?

Multicollinearity refers to a situation in statistical modeling where two or more predictor variables in a multiple regression model are highly correlated, meaning that they contain similar information about the variance in the dependent variable. This can make it difficult to determine the individual effect of each predictor on the dependent variable.

Why can't bivariate correlation always detect multicollinearity?

Bivariate correlation measures the strength and direction of the linear relationship between two variables. However, multicollinearity involves the interrelationships among three or more variables. Even if the bivariate correlations between pairs of variables are low, a combination of these variables can still exhibit multicollinearity when considered together in a multiple regression model.

How can multicollinearity be detected if not through bivariate correlation?

Multicollinearity can be detected using several methods, such as examining the variance inflation factor (VIF) for each predictor, checking the condition index of the design matrix, or inspecting the eigenvalues of the predictors' correlation matrix (near-zero eigenvalues signal near-linear dependencies). High VIF values (typically above 10) or a high condition index (typically above 30) indicate potential multicollinearity.
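A minimal numpy sketch of the VIF and condition-index diagnostics on simulated data (scaling the columns to unit length before taking the condition number follows the common convention; the near-collinear construction of the third predictor is an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
x3 = x1 + x2 + 0.05 * rng.standard_normal(n)  # nearly collinear with x1, x2
X = np.column_stack([x1, x2, x3])

# VIF for each column: 1 / (1 - R^2) from regressing it on the others
for j in range(X.shape[1]):
    others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
    beta, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
    r2 = 1 - (X[:, j] - others @ beta).var() / X[:, j].var()
    print(f"VIF_{j + 1} = {1 / (1 - r2):.0f}")          # all far above 10

# Condition number: largest-to-smallest singular value ratio of X,
# with each column scaled to unit length; above ~30 is the usual red flag
Xs = X / np.linalg.norm(X, axis=0)
print(f"condition number = {np.linalg.cond(Xs):.0f}")
```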

What are the consequences of multicollinearity in a regression model?

Multicollinearity can inflate the standard errors of the coefficient estimates, making them less reliable and less precise. Predictors can then appear statistically insignificant even when they are important. It can also make the estimated coefficients highly sensitive to small changes in the data or in the model specification, making their interpretation difficult.
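The standard-error inflation is easy to see in simulation. A minimal numpy sketch (the sample size and the correlation 0.99 are arbitrary choices), using the usual OLS formula ##\widehat{SE}(\hat\beta_j) = \sqrt{\hat\sigma^2 [(X^T X)^{-1}]_{jj}}##:

```python
import numpy as np

def ols(X, y):
    """OLS estimates and their standard errors."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    return beta, se

rng = np.random.default_rng(4)
n = 1_000
x1 = rng.standard_normal(n)
eps = rng.standard_normal(n)

# Same true model y = x1 + x2 + noise; only the predictor correlation changes
for rho in (0.0, 0.99):
    x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)
    y = x1 + x2 + eps
    X = np.column_stack([np.ones(n), x1, x2])
    beta, se = ols(X, y)
    # se(b1) grows by roughly 1/sqrt(1 - rho^2): about 7x here
    print(f"rho = {rho}: b1 = {beta[1]:.2f}, se(b1) = {se[1]:.3f}")
```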

How can multicollinearity be addressed or mitigated?

Several approaches can be used to address multicollinearity, including removing highly correlated predictors, combining correlated variables into a single predictor through techniques like Principal Component Analysis (PCA), or using regularization methods such as Ridge Regression or Lasso Regression that can handle multicollinearity by adding a penalty to the regression coefficients.
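As one concrete illustration of the ridge option, a minimal numpy sketch using the closed form ##\hat\beta_\lambda = (X^T X + \lambda I)^{-1} X^T y## (the penalty ##\lambda = 10## is an arbitrary, untuned choice; in practice it would be selected, e.g., by cross-validation):

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge estimate; assumes columns on comparable scales."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)

rng = np.random.default_rng(5)
n = 1_000
x1 = rng.standard_normal(n)
x2 = 0.99 * x1 + np.sqrt(1 - 0.99**2) * rng.standard_normal(n)  # near-collinear
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.standard_normal(n)   # true coefficients are 1 and 1

# OLS splits the shared signal between x1 and x2 erratically;
# the ridge penalty pulls the pair toward a stable, nearly equal split.
print("OLS:  ", np.linalg.solve(X.T @ X, X.T @ y).round(2))
print("ridge:", ridge(X, y, lam=10.0).round(2))
```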
