Hey everyone,
I've been going through a linear algebra textbook, and there's one theorem I can't quite follow. It proves that the entries of A(adj A) are |A| on the diagonal and 0 off the diagonal, so that A(adj A) = |A|I.
The first part shows that the diagonal entries of the product A(adj A) equal |A|: each one is the dot product of a row of A with the corresponding column of adj A, which consists of the cofactors of that same row, so the sum is just the cofactor expansion of |A| along that row. This first part I understand, even if I haven't put it very clearly.
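In symbols, the way I read the diagonal case (writing [itex]c_{ik}[/itex] for the cofactor of [itex]a_{ik}[/itex], as the book does), each diagonal entry is just the cofactor expansion of the determinant along row t:

[tex](A\,\text{adj}\,A)_{tt} = \sum^{n}_{k=1} a_{tk}c_{tk} = |A|[/tex]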
The main theorem is: for any n×n matrix A, [itex]\sum^{n}_{k=1}a_{tk}c_{ik} = \delta_{ti}|A|[/itex], where [itex]\delta_{ti}[/itex] is 1 if t = i and 0 otherwise.
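Assuming I have the definition of adj A right (the transpose of the cofactor matrix), column i of adj A is [itex](c_{i1}, \dots, c_{in})^{T}[/itex], so the sum in the theorem is exactly the (t, i) entry of the product:

[tex](A\,\text{adj}\,A)_{ti} = \sum^{n}_{k=1} a_{tk}c_{ik}[/tex]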
As an example, a 3×3 matrix B is constructed by replacing the second row of A with any one of A's rows, including the second row itself. If the second row is replaced with itself, then B = A, and multiplying that row by the corresponding column of adj A gives |A|, which we already know.
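If I've transcribed the book's construction correctly, with row 2 of B equal to row t of A, then expanding |B| along its second row uses the same cofactors [itex]c_{2k}[/itex] as A (the minors delete row 2, the only row where A and B differ), giving

[tex]|B| = \sum^{3}_{k=1} b_{2k}c_{2k} = \sum^{3}_{k=1} a_{tk}c_{2k}[/tex]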
So far I follow, though constructing B this way seems awkward to me. The next step is what confuses me. Now we have to prove that the off-diagonal entries are zero. The example is that if B's second row is replaced with a different row of A, then B has two identical rows; subtracting one from the other produces a zero row (without changing the determinant), so |B| = 0. The rule that a matrix with two identical rows has determinant zero makes sense to me, but I don't see what the construction of B has to do with A(adj A). That's what throws me off: I don't understand the connection between this product and B.
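For what it's worth, I did check the identity numerically and it holds, so it's only this step of the proof I'm stuck on. Here's my quick sketch (the matrix values are just made up for the test, and the adjugate is built straight from the cofactor definition):

[code]
import numpy as np

def adjugate(A):
    """Adjugate via cofactors: adj(A)[j, i] is the (i, j) cofactor of A."""
    n = A.shape[0]
    adj = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # Minor: delete row i and column j, then apply the sign (-1)^(i+j).
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            adj[j, i] = (-1) ** (i + j) * np.linalg.det(minor)
    return adj

A = np.array([[2.0, 1.0, 3.0],
              [0.0, 4.0, 1.0],
              [5.0, 2.0, 2.0]])  # arbitrary test values

print(np.round(A @ adjugate(A), 10))  # |A| on the diagonal, 0 elsewhere
print(np.linalg.det(A))               # matches the diagonal entries
[/code]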
What am I missing? I feel like it's right in front of me but I can't see it!
Thanks for your help!