Linear Transformations of the design matrix

  • #1
Mark53

Homework Statement


Given that X is an n × p matrix with linearly independent columns,

and $$X^∗ = XA$$ where A is an invertible p × p matrix.

a)

Show that: $$X^*\left((X^*)^TX^*\right)^{-1}(X^*)^T = X(X^TX)^{-1}X^T$$

b)
Consider two alternative models
$$M : Y = Xβ + ε$$ and $$M^∗ : Y = X^∗β ^∗ + ε$$

Show that $$η^∗ = η$$, i.e., the vector of fitted values is the same, whatever the form of the design matrix X.

The Attempt at a Solution



a) $$X^*\left((X^*)^TX^*\right)^{-1}(X^*)^T = XA(X^TA^TXA)^{-1}X^TA^T$$

multiplying by A inverse

$$=X(X^TA^TXA)^{-1}X^TA^T$$

can I then multiply it by the A transpose to give:

$$=X(X^TA^TXA)^{-1}X^TA^TA^T$$

$$=X(X^TA^TXA)^{-1}X^TA$$

Then the inverse of A again to get

$$=X(X^TA^TXA)^{-1}X^T$$

not sure if this is the right process

b)

not sure how to get started on this part
 
  • #2
I gathered all scalars are in reals
- - - -

For starters: if
##X^∗ = XA##

then

##\big(X^∗\big)^T \neq X^TA^T##

do you get why ##H = X(X^TX)^{-1}X^T## exists, and why ##H## is a projector? This is sometimes called the Hat Matrix and it's worth spending some time drilling it.

We can worry about part (b) later. I don't think you defined what ##\beta^*## is, or fully stated the definition of ##\eta## just yet.
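The two defining properties of the hat matrix hinted at here — symmetry and idempotence — are easy to verify numerically. A minimal sketch in NumPy, using arbitrary illustrative data for ##X## (any tall matrix with independent columns will do):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 3))  # n = 6, p = 3: full column rank with probability 1

# Hat matrix H = X (X^T X)^{-1} X^T
H = X @ np.linalg.inv(X.T @ X) @ X.T

print(np.allclose(H, H.T))    # symmetric
print(np.allclose(H @ H, H))  # idempotent: projecting twice changes nothing
print(np.allclose(H @ X, X))  # H fixes the column space of X
```

All three checks print True: ##H## orthogonally projects onto the column space of ##X##, which is why applying it twice is the same as applying it once.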
 
  • #3
StoneTemplePython said:
I gathered all scalars are in reals
- - - -

For starters: if
##X^∗ = XA##

then

##\big(X^∗\big)^T \neq X^TA^T##

do you get why ##H = X(X^TX)^{-1}X^T## exists, and why ##H## is a projector? This is sometimes called the Hat Matrix and it's worth spending some time drilling it.

We can worry about part (b) later. I don't think you defined what ##\beta^*## is, or fully stated the definition of ##\eta## just yet.

Wait would it be

$$(X^*)^T = (XA)^T= A^TX^T$$

which would give $$XA(A^TX^TXA)^{-1}A^TX^T$$

Where would I go from here?
 
  • #4
Mark53 said:
Wait would it be

$$(X^*)^T = (XA)^T= A^TX^T$$

which would give $$XA(A^TX^TXA)^{-1}A^TX^T$$

Where would I go from here?
Yes... now do a dimensions check...

is ##A## invertible? is ##X##?
 
  • #5
StoneTemplePython said:
Yes... now do a dimensions check...

is ##A## invertible? is ##X##?
We know that A is invertible which means we can:

$$XAA^{-1}(A^TX^TXA)^{-1}A^TX^T$$

$$=X(A^TX^TXA)^{-1}A^TX^T$$
 
  • #6
I have no idea how you went from

##XA(A^TX^TXA)^{-1}A^TX^T##

to

Mark53 said:
We know that A is invertible which means we can:

$$XAA^{-1}(A^TX^TXA)^{-1}A^TX^T$$

In general matrix multiplication is not commutative... I'm getting the sense that we're hitting a fundamental gap in your understanding of matrix algebra.

What I was hinting at was, for 3 invertible matrices ##Q, R, S##

##\big(QRS\big)^{-1}= S^{-1}R^{-1}Q^{-1}##

Using associativity of matrix multiplication you should be able to find 3 invertible matrices to apply that to, and go from there. Forum rules prohibit me giving the complete answer.
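The reverse-order law for inverses stated above can be sanity-checked numerically. A quick sketch with three arbitrary random matrices (square Gaussian matrices are invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)
Q, R, S = (rng.standard_normal((4, 4)) for _ in range(3))  # invertible w.p. 1

lhs = np.linalg.inv(Q @ R @ S)
rhs = np.linalg.inv(S) @ np.linalg.inv(R) @ np.linalg.inv(Q)
print(np.allclose(lhs, rhs))  # True: the inverses come out in reverse order
```

Note that `np.linalg.inv(Q) @ np.linalg.inv(R) @ np.linalg.inv(S)` (same order) would generally *not* match, precisely because matrix multiplication is not commutative.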

To be honest, if you're stumbling on transposing and inverting matrices, this problem may be out of reach. If this is for a course in statistics, the math will get more involved as you progress... I'd suggest drilling the linear algebra heavily over the next few weeks to get there. There are a lot of good free resources out there like this: https://math.byu.edu/~klkuttle/0000ElemLinearalgebratoprint.pdf , which has lots of problems and solutions to selected exercises.
 
  • #7
StoneTemplePython said:
I have no idea how you went from

##XA(A^TX^TXA)^{-1}A^TX^T##

to
In general matrix multiplication is not commutative... I'm getting the sense that we're hitting a fundamental gap in your understanding of matrix algebra.

What I was hinting at was, for 3 invertible matrices ##Q, R, S##

##\big(QRS\big)^{-1}= S^{-1}R^{-1}Q^{-1}##

Using associativity of matrix multiplication you should be able to find 3 invertible matrices to apply that to, and go from there. Forum rules prohibit me giving the complete answer.

To be honest, if you're stumbling on transposing and inverting matrices, this problem may be out of reach. If this is for a course in statistics, the math will get more involved as you progress... I'd suggest drilling the linear algebra heavily over the next few weeks to get there. There are a lot of good free resources out there like this: https://math.byu.edu/~klkuttle/0000ElemLinearalgebratoprint.pdf , which has lots of problems and solutions to selected exercises.
##XA(A^TX^TXA)^{-1}A^TX^T##

##=XAA^{-1}X^{-1}(X^T)^{-1}(A^T)^{-1}A^TX^T##
##=XIX^{-1}(X^T)^{-1}(A^{-1}A)^TX^T##
##=X(X^TX)^{-1}IX^T##
##=X(X^TX)^{-1}X^T##

Is this correct now?
 
  • #8
Mark53 said:
##XA(A^TX^TXA)^{-1}A^TX^T##

##=XAA^{-1}X^{-1}(X^T)^{-1}(A^T)^{-1}A^TX^T##
##=XIX^{-1}(X^T)^{-1}(A^{-1}A)^TX^T##
##=X(X^TX)^{-1}IX^T##
##=X(X^TX)^{-1}X^T##

Is this correct now?
Getting closer. The issue which I alluded to with "dimensions check" is that ##X## is not square, so it cannot have an actual inverse... It is square after being multiplied by a certain other matrix, which use of parentheses (associativity) can exploit.
 
  • #9
StoneTemplePython said:
Getting closer. The issue which I alluded to with "dimensions check" is that ##X## is not square, so it cannot have an actual inverse... It is square after being multiplied by a certain other matrix, which use of parentheses (associativity) can exploit.

##X^TX##

That would be a square (p × p) matrix, and it is invertible because the columns of ##X## are linearly independent, so ##X^TX## has full rank.
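The identity in part (a) — and the fitted-values claim in part (b) — can also be confirmed numerically. A sketch with arbitrary illustrative matrices (any full-column-rank ##X## and invertible ##A## will do):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 8, 3
X = rng.standard_normal((n, p))  # full column rank with probability 1
A = rng.standard_normal((p, p))  # invertible with probability 1
Xs = X @ A                       # X* = XA

H  = X  @ np.linalg.inv(X.T @ X)   @ X.T
Hs = Xs @ np.linalg.inv(Xs.T @ Xs) @ Xs.T
print(np.allclose(H, Hs))  # True: same hat matrix for either design matrix

# Part (b): fitted values eta = H y therefore agree as well
y = rng.standard_normal(n)
print(np.allclose(H @ y, Hs @ y))  # True
```

This is the content of part (b): since ##\eta = Hy## and ##\eta^* = H^*y## with ##H = H^*##, the fitted values do not depend on the parameterization of the column space.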
 

FAQ: Linear Transformations of the design matrix

1. What is a linear transformation in the context of design matrices?

In this context, a linear transformation of the design matrix replaces the original design matrix X with X∗ = XA, where A is an invertible matrix. This is a change of basis for the columns of X: each column of X∗ is a linear combination of the original columns, so both matrices span the same column space.

2. How is the design matrix used in linear transformations?

The design matrix contains one row per observation and one column per predictor (often including a column of ones for the intercept). Multiplying it by the coefficient vector produces the model's fitted values, η = Xβ. Transforming the design matrix by an invertible matrix reparameterizes the model without changing the set of fitted values it can produce.
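As a concrete sketch of this usage (toy illustrative data; NumPy's least-squares routine): the design matrix maps estimated coefficients to fitted values.

```python
import numpy as np

# Toy design matrix: a column of ones (intercept) plus one predictor
x = np.array([0.0, 1.0, 2.0, 3.0])
X = np.column_stack([np.ones_like(x), x])
y = np.array([1.1, 2.9, 5.2, 6.8])

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares coefficients
eta = X @ beta_hat                                # fitted values: eta = X beta_hat
print(eta.shape)  # (4,) - one fitted value per observation
```

The residual y − η is orthogonal to the columns of X, which is another way of saying the fitted values are the projection of y onto the column space of the design matrix.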

3. What is the importance of linear transformations in statistical analysis?

Linear transformations play a crucial role in statistical analysis as they allow for the manipulation and simplification of data. They also help in interpreting and visualizing data in a more meaningful way. Additionally, linear transformations are used in various statistical models such as regression and ANOVA to understand the relationship between variables.

4. Can a linear transformation change the shape of a dataset?

Yes, a linear transformation can change the shape of a dataset. For example, if a dataset is transformed using a rotation matrix, the original shape of the data will be altered. Similarly, other transformations such as scaling and shearing can also change the shape of a dataset.

5. How is the effectiveness of a linear transformation evaluated?

The effectiveness of a linear transformation is evaluated by examining the resulting data and determining if it has been simplified or made more interpretable. Additionally, statistical measures such as the coefficient of determination (R-squared) can be used to assess the goodness of fit of the transformed data in relation to the original data. It is also important to consider the purpose of the transformation and if it has achieved the intended goal.
