Trollfaz
The general linear model is
$$y = a_0 + \sum_{i=1}^{k} a_i x_i.$$
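For concreteness, here is a minimal NumPy sketch of fitting this model by ordinary least squares (my own illustration; the sample size, coefficients, and noise level below are made up):

```python
import numpy as np

# Minimal sketch: fit y = a_0 + sum_i a_i x_i by ordinary least squares
# on synthetic data (all numbers here are made up for illustration).
rng = np.random.default_rng(0)
n, k = 200, 3                       # n observations, k regressors, n >> k
X = rng.normal(size=(n, k))         # column i holds the n observations of x_i
a_true = np.array([2.0, -1.0, 0.5])
y = 1.0 + X @ a_true + rng.normal(scale=0.1, size=n)  # a_0 = 1.0, plus noise

# Prepend a column of ones so the intercept a_0 is estimated too,
# then solve min_a ||y - X_aug a||^2.
X_aug = np.column_stack([np.ones(n), X])
a_hat, *_ = np.linalg.lstsq(X_aug, y, rcond=None)
print(a_hat)  # close to [1.0, 2.0, -1.0, 0.5]
```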
In regression analysis one collects n observations of y at different values of the inputs ##x_i##; one needs n >> k, or the fit runs into trouble. For each regressor ##x_i##, and for the response y, we stack the n observations into vectors ##\textbf{x}_i## and ##\textbf{y}##, each a vector in ##\mathbb{R}^n##. Multicollinearity is the problem of significant correlation among the ##x_i##s, and in practice some degree of multicollinearity always exists. So does a complete absence of multicollinearity mean that the ##\textbf{x}_i## are all orthogonal to each other, i.e.
$$\textbf{x}_i \cdot \textbf{x}_j = 0$$
for all distinct i, j? And does strong multicollinearity mean that one or more of the vectors makes a very small angle with the subspace formed by the other vectors? As far as I know, perfect multicollinearity means rank(X) < k, where X is the n by k matrix whose ith column is ##\textbf{x}_i##.
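Here is a minimal NumPy sketch of the two cases just described (again my own illustration; the linear combination and perturbation size are made up): exact rank deficiency under perfect multicollinearity, and a small angle between one column and the span of the others under strong multicollinearity.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 3
X = rng.normal(size=(n, k))

# Perfect multicollinearity: make column 2 an exact linear combination
# of columns 0 and 1, so rank(X) < k.
X_perfect = X.copy()
X_perfect[:, 2] = 2.0 * X_perfect[:, 0] - 3.0 * X_perfect[:, 1]
print(np.linalg.matrix_rank(X_perfect))  # 2, i.e. rank < k

# Strong (near) multicollinearity: column 2 almost lies in the span
# of the other columns.
X_near = X.copy()
X_near[:, 2] = (2.0 * X_near[:, 0] - 3.0 * X_near[:, 1]
                + 1e-3 * rng.normal(size=n))

# Angle between x_2 and the subspace spanned by the other columns,
# via least-squares projection onto that subspace:
# sin(theta) = ||perpendicular component|| / ||x_2||.
others = X_near[:, :2]
x2 = X_near[:, 2]
coef, *_ = np.linalg.lstsq(others, x2, rcond=None)
residual = x2 - others @ coef
angle = np.arcsin(np.linalg.norm(residual) / np.linalg.norm(x2))
print(np.degrees(angle))  # tiny angle -> strong multicollinearity
```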