Proving Similarity of Matrices: A Faster Approach

In summary, the conversation discusses proving the similarity of two matrices and the importance of understanding the geometric meaning of similarity. It is shown that having equal trace is not sufficient for two matrices to be similar and that a change of basis can be used to show similarity. It is also mentioned that Q does not have to be unique in order for two matrices to be similar. Finally, the concept of similarity as an equivalence relation is explained and demonstrated through the given example.
  • #1
loli12
[SOLVED] Similar matrices

I was given two matrices and need to prove that they are similar. After performing row operations, I got
A =
[1 0 0]
[0 4 0]
[0 0 6]
and B =
[6 0 0]
[0 4 0]
[0 0 1]
I was stupid enough not to use the fact that their traces are equal to prove it, and instead kept trying to find the invertible matrix Q that satisfies A = (Q^-1)BQ. But I still can't figure out a Q that does the work.
Can anyone tell me if there's another way to prove the two matrices are similar that doesn't deal with the trace? Or any fast way to figure out Q?
 
  • #2
Having equal trace isn't sufficient for two matrices to be similar! Consider the diagonal matrices A and B, with A's diagonal entries being (2, 0) and B's diagonal entries being (1, 1).
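As a quick numeric illustration of this counterexample (a sketch in Python with numpy, which is an assumption — the thread itself uses no code):

```python
import numpy as np

# Equal trace, yet not similar: similar matrices must share eigenvalues.
A = np.diag([2.0, 0.0])
B = np.diag([1.0, 1.0])

print(np.trace(A), np.trace(B))          # 2.0 2.0 -- traces agree
print(np.sort(np.linalg.eigvals(A)))     # [0. 2.]
print(np.sort(np.linalg.eigvals(B)))     # [1. 1.] -- eigenvalues differ
```

Since the eigenvalue lists differ, no invertible Q with A = Q^-1 B Q can exist.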

I think this is a case of not understanding the geometric meaning of similarity! (Which, in this case at least, makes the similarity transformation obvious)

Suppose A and B are similar, so that [itex]A = Q B Q^{-1}[/itex]. This is equivalent to saying that [itex]AQ = QB[/itex].

Now, for any vector v, we have [itex]A(Qv) = Q (Bv)[/itex].

Can you interpret that equation geometrically? (This question is an important one! If you cannot answer it, keep in mind that one of the things you're supposed to get out of this class is to develop your geometric intuition, so think about it)

Even if your geometric intuition is failing, you could appeal to your algebraic intuition! (Same comment applies) When you want to use algebra on matrices, it is often a very good idea to study what they do to certain vectors...


This approach won't work for all matrices, incidentally, but it will for all "good" ones.
 
  • #3
Well, I think A(Qv) means you first express v with respect to a new basis, and then A transforms Qv to another vector? And Q(Bv) does it in the reverse order?...
 
  • #4
Right -- so, it sounds like the fact that A and B are similar means that we can find a change of basis Q such that the following two procedures are the same:

Change the basis, then apply A.
Apply B, then change the basis.

So that A and B are really doing the "same thing" (thus the term similar), just with regards to different bases.


Your two given matrices are really easy to understand geometrically -- what change of basis is suggested here?
 
  • #5
Thanks for the prompt reply!
So I guess I can let Q be the permutation matrix that reorders the rows/columns:
[0 0 1]
[0 1 0]
[1 0 0]
Just want to make sure: as long as I can find a Q that fits AQ = QB, that automatically shows A ~ B, and Q doesn't have to be unique?
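For the record, the proposed permutation matrix can be checked directly (a sketch using numpy, which is an assumption — multiplying by hand works just as well):

```python
import numpy as np

A = np.diag([1.0, 4.0, 6.0])
B = np.diag([6.0, 4.0, 1.0])

# Permutation matrix swapping the first and third basis vectors.
Q = np.array([[0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0]])

assert np.linalg.det(Q) != 0                      # Q is invertible
assert np.allclose(A @ Q, Q @ B)                  # AQ = QB
assert np.allclose(np.linalg.inv(Q) @ B @ Q, A)   # A = Q^-1 B Q
print("Q works")
```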
 
  • #6
just want to make sure: as long as I can find a Q that fits AQ = QB, that automatically shows A ~ B, and Q doesn't have to be unique?
Almost: you have to find an invertible Q. And Q couldn't possibly be unique, because 2Q would also suffice (check: A(2Q) = 2AQ = 2QB = (2Q)B, and 2Q is invertible). However, there do exist matrices that are similar in more substantially different ways. How many ways is I similar to I?
 
  • #7
I have to use my geometric intuition. :frown:

I must be stupid, because I have no clue how that is pictured. I know about similar matrices, how to determine whether or not two matrices are similar, and how to find a matrix P such that A = P^-1 B P.

I just read the theorems, and they all make sense. I can usually see how a proof is going to go, since the book works through it anyway. Sometimes the proof is tricky and I can't get it on the spot, but I still know intuitively that it is true.
 
  • #8
Geometrically, Hurkyl is saying that two linear transformations are similar if they perform the same action, just in different coordinates of the space. I.e., if A only dilates one subspace by t and B only dilates another subspace by t, they are similar, because the action of B can be carried out by rotating the relevant subspace onto the subspace dilated by A, applying A, then rotating back. Also, a scaling of a single axis by a negative value is the same as a rotation of that axis by [itex]\pi[/itex] (here the intervening change of basis is just the identity).
Similarly, all matrices representing the same linear transformation with respect to different bases are similar. Viewing similarity as an equivalence relation picks out the set of distinct linear transformations on a space.
In this light, we can suppose A in the original problem acts on the standard basis. B is then obviously similar: it carries out A by changing the basis, dilating the relevant subspaces, then changing the basis back. The y-axis stays fixed, the x-axis is carried to the positive z-axis, and the z-axis to the positive x-axis. This is exactly the reflection that swaps the x- and z-axes, whose matrix you can then write down.
 

FAQ: Proving Similarity of Matrices: A Faster Approach

What is the purpose of proving similarity of matrices?

The purpose of proving similarity of matrices is to determine whether two matrices are essentially the same, with the only difference being a change in basis. This allows for easier computation and analysis of matrix operations.

How is similarity of matrices typically proven?

Traditional methods of proving similarity of matrices involve finding a change of basis matrix and performing matrix multiplication. This can be time-consuming and computationally intensive, especially for larger matrices.

What is the faster approach for proving similarity of matrices?

A faster approach is to compare eigenvalues. Similar matrices have the same characteristic polynomial, and hence the same eigenvalues with the same multiplicities. Moreover, if both matrices are diagonalizable and share the same eigenvalues, they are similar, since each is then similar to the same diagonal matrix.

Can the eigenvalue test be used for all types of matrices?

It applies to any square matrices of the same size (similarity is only defined for square matrices). Matching eigenvalues are always a necessary condition; they are also sufficient whenever both matrices are diagonalizable, for example when each has distinct eigenvalues or is symmetric.

Are there any limitations to this approach?

Yes. When the matrices are not diagonalizable, matching eigenvalues are not enough: the 2x2 matrix with a single 1 above the diagonal and the 2x2 zero matrix both have eigenvalue 0 with multiplicity two, yet they are not similar. In that case one must compare Jordan (or rational) canonical forms, which classify matrices up to similarity completely. Complex eigenvalues pose no problem; they are handled the same way.
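The eigenvalue comparison can be sketched in code (numpy assumed; the helper name plausibly_similar is hypothetical). It is a necessary test in general, and conclusive when both matrices are diagonalizable:

```python
import numpy as np

def plausibly_similar(A, B, tol=1e-9):
    """Check the necessary condition: similar matrices have identical
    eigenvalues (with multiplicity). This is also sufficient when both
    matrices are diagonalizable."""
    ea = np.sort_complex(np.linalg.eigvals(A))
    eb = np.sort_complex(np.linalg.eigvals(B))
    return bool(np.allclose(ea, eb, atol=tol))

A = np.diag([1.0, 4.0, 6.0])
B = np.diag([6.0, 4.0, 1.0])
print(plausibly_similar(A, B))   # True: same eigenvalues, both diagonalizable
```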
