Power of a Diagonalized Matrix?

In summary, the conversation discusses how to prove that if a matrix $$D$$ is diagonal, then $$D^{n}$$ is also diagonal, with elements equal to the nth powers of the elements of $$D$$. Several approaches are suggested, including using the trace and determinant, index notation, and summation notation. The final solution applies the definition of matrix multiplication and the observation that the only nonzero terms in the sum are those in which all the indices are equal.
  • #1
kq6up

Homework Statement



From Mary Boas' "Mathematical Methods in the Physical Sciences 3rd Ed."

Chapter 3 Section 11 Problem 57

Show that if $$D$$ is a diagonal matrix, then $$D^{n}$$ is the diagonal matrix with elements equal to the nth power of the elements of $$D$$.

Homework Equations



I think ##D^{n}=C^{-1}M^{n}C##

The Attempt at a Solution



I think this hints at this being the case, but I don't think it proves it:

$$\mathrm{Tr}(D^{n})=\mathrm{Tr}(C^{-1}M^{n}C)=\mathrm{Tr}(CC^{-1}M^{n})=\mathrm{Tr}(M^{n})$$

I also tried using the summation form of matrix multiplication, but I am not familiar enough with the formalism to feel confident about what I am doing.

Furthermore, I think it is rather obvious that this is the case if one just multiplies a diagonal matrix by itself.

Any hints as to a different approach?

Thanks,
Chris Maness
 
  • #2
I think it's rather obvious too. But use index notation for the matrix product to prove it. You may also want to do a simple induction.
 
  • #3
Another quick thought. Since $$[\det(D)]^{n}=\det(D^{n})$$, would that not have to be the case?

Chris
 
  • #4
No, many different matrices have the same trace or determinant, so neither of those is sufficient to show that the matrices are the same.
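For instance, here is a minimal Python/NumPy sketch (the specific entries are arbitrary) of two unequal matrices that share both invariants:

[code]
import numpy as np

# Two different 2x2 matrices with the same trace (3) and determinant (2).
A = np.array([[1.0, 0.0],
              [0.0, 2.0]])
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])

print(np.trace(A), np.trace(B))            # 3.0 3.0
print(np.linalg.det(A), np.linalg.det(B))  # 2.0 2.0
print(np.array_equal(A, B))                # False
[/code]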
 
  • #7
So I need a Kronecker delta, correct? I imagine I would use both pi and sigma notation. If I finish this, it is merely for the sake of having a better understanding of using pi and sigma notation, not because I will gain insight into $$D^{n}$$.

Chris
 
  • #8
I don't see any reason to make it THAT formal. Just comment that you must have ##i_0=i_1=i_2=...=i_n## and explain why it must be true.
 
  • #9
When I see nested sigmas, my eyes cross. I might be making it more difficult than it is. I have to picture the matrix being built in my mind's eye, and if the indices are more complex than in the most straightforward examples, I have difficulty. Maybe I need some exercises of this type that build up along a moderate learning curve, to develop confidence and intuition. Any suggestions for a worksheet or something of that nature, or a reference to an online book?

Thanks,
Chris Maness
 
  • #10
This one isn't that hard. The only term in that big sum that can be nonzero is the one in which all the indices are equal.
 
  • #11
That is why I need the practice -- because I am so bad at seeing the obvious with this stuff :D

Chris Maness
 
  • #12
This problem is really messing with my head. It is obviously true, but I am completely at a loss as to how to write the summation in a general way for an arbitrary power. I can do it for $$A^{2}$$, but not for an arbitrary power. I don't want the answer; I just need to get a better feel for doing general proofs with summation and/or product notation. I really don't have any experience with this.

Here is my solution for a diagonal matrix of ANY size multiplied to itself:

$${ \left[ { A }^{ 2 } \right] }_{ ij }=\sum _{ k=1 }^{ n }{ { A }_{ ik }{ A }_{ kj }{ \delta }_{ ik }{ \delta }_{ jk } } $$ where n is the size of the matrix.

Any hints appreciated.

Thanks,
Chris Maness
 
  • #13
Are you just trying to make this harder than it is?
##{ \left[ { A }^{ 2 } \right] }_{ ij }=\sum _{ k=1 }^{ n }{ { A }_{ ik }{ A }_{ kj } }##. That's it. No Kroneckers. It's the definition.
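To see that definition at work numerically, here is a minimal Python/NumPy sketch (the diagonal entries are arbitrary) that squares a diagonal matrix straight from that sum:

[code]
import numpy as np

A = np.diag([2.0, 3.0, 5.0])  # an arbitrary diagonal matrix

# Apply the definition [A^2]_ij = sum_k A_ik * A_kj directly.
s = A.shape[0]
A2 = np.array([[sum(A[i, k] * A[k, j] for k in range(s)) for j in range(s)]
               for i in range(s)])

# The result is diagonal, with each diagonal entry squared.
print(np.allclose(A2, np.diag([4.0, 9.0, 25.0])))  # True
[/code]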
 
  • #14
Yes, I see that now because they are already zeros.

Chris
 
  • #15
Ok, so can you give an argument that shows what you want for ##D^2##?
 
  • #16
Ok I am going to go for the more general case here. Here is my new summation formula (don't laugh if it looks stupid -- I don't know what I am doing here):

$${ \left[ { A }^{ n } \right] }_{ ij }=\prod _{ l=1 }^{ n }{ \begin{bmatrix} { A }_{ 11 } & 0 & \cdots \\ 0 & \ddots & 0 \\ \vdots & 0 & { A }_{ ss } \end{bmatrix} } =\sum _{ k=1 }^{ s }{ \prod _{ l=1 }^{ n }{ { A }_{ kk } } } ={ A }_{ 11 }^{ n }+\cdots +{ A }_{ ss }^{ n }$$

1. If a diagonal matrix multiplies any other matrix, all off-diagonal elements are destroyed by the zeros in the diagonal matrix.

2. When two diagonal matrices are multiplied, elements in the same row/column position are multiplied together.

3. If a diagonal matrix is multiplied by itself, each diagonal element is squared.

∴ A diagonal matrix to the nth power just raises each diagonal element to the nth power.

Does this work?

Thanks,
Chris Maness
 
  • #17
No, it doesn't work. Where did the summation come from? You are supposed to argue that ##(D^n)_{ij}=(D_{ij})^n## for D a diagonal matrix. Go back and look at the formula you gave for the matrix product and tell me why it's true for n=2.
 
  • #18
Does the argument work for ##A^{2}##?

Chris
 
  • #19
NO! Start from the formula in post #13 and say what happens if A is diagonal.
 
  • #20
If A is diagonal, then the diagonal elements are simply squared. I saw that a long time ago; I am just having trouble formalizing it.

Chris
 
  • #21
Try proving that the product of any two N×N diagonal matrices A and B is also a diagonal matrix, with the (i,i)th diagonal element of the product matrix given by the product of the corresponding diagonal elements of A and B.

From there, it's an easy induction to show that ##(A^n)_{ii}## is ##(A_{ii})^n##.
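A minimal NumPy sketch of that first step, with arbitrary diagonal entries: the product of two diagonal matrices is diagonal, and its diagonal is the elementwise product of the factors' diagonals.

[code]
import numpy as np

A = np.diag([2.0, 3.0, 5.0])
B = np.diag([7.0, 11.0, 13.0])

P = A @ B  # ordinary matrix product

# np.diag(P) extracts the diagonal; np.diag(vector) rebuilds a diagonal matrix.
print(np.allclose(P, np.diag(np.diag(A) * np.diag(B))))  # True
[/code]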
 
  • #22
If A is diagonal then ##A_{ij}## is zero unless ##i=j##. That's what you need to formalize it.
What does the sum describing ##A^2## become in that case?
 
  • #23
Dick, the sum becomes: $${ \left[ { A }^{ 2 } \right] }_{ ii }=\sum _{ i=1 }^{ S }{ { A }_{ ii }^{ 2 } } $$

DH, this would be: $${ \left[ AB \right] }_{ ii }=\sum _{ i=1 }^{ S }{ A_{ ii }B_{ ii } }$$, and yes, by induction it follows that $${ \left[ A^{ n } \right] }_{ ii }=\sum _{ i=1 }^{ S }{ A_{ ii }^{ n } }$$, where S is the size of the matrix.

Is this ok, now?

Chris
 
  • #24
No, this is not ok now.

You must not use i as both a specific index (e.g., how you used i on the left hand side) and as a dummy index of a sum (e.g., how you used i on the right hand side). What you wrote is not well formed.
 
  • #25
Is this better? $${ \left[ A^{ n } \right] }_{ ij }=\sum _{ i=1 }^{ S }{ A_{ ij }^{ n } }$$ when ##i=j##, but ##A_{ij}=0## when ##i\neq j##.

Chris
 
  • #26
No! You are making the same mistake as before. You have i and j as free indices on the left. You cannot reuse either one as a dummy index on the right. Your dummy index (i.e., the i in ##\sum_i##) must be something different from your free indices.
 
  • #27
Ahhh, $${ \left[ A^{ n } \right] }_{ ij }=\sum _{ k=1 }^{ S }{ A_{ ik }^{ n } }$$ when ##i=k##, but ##A_{ik}=0## when ##i\neq k##, where A is a diagonal matrix.

Ok? Sorry for the test of patience. I am doing fine on all the problems that don't involve ∑. For some reason those throw me.

Chris
 
  • #28
This is really dragging on, and you don't seem to be getting it. Suppose A is a 2×2 diagonal matrix. What you want to prove is (for example) ##(A^2)_{11}=(A_{11})^2##. I hope you agree with that. Notice there is NO SUMMATION on the right hand side!
 
  • #29
I get it, and I agree with what you have above. I am just having difficulty expressing it. Do you have a textbook you can recommend so I can study these types of proofs further? Proofs just aren't my strong suit. I did all the problems in section 12 with no trouble (no proofs :D) while waiting on responses to this one, so I don't think I am stupid :D

Thanks,
Chris Maness
 
  • #30
##{ \left[ { A }^{ 2 } \right] }_{ ij }=\sum _{ k=1 }^{ n }{ { A }_{ ik }{ A }_{ kj } }##. If i is not equal to j, then the sum is 0, isn't it? Why? If i=j, then what is it? There is no sum on the right side anymore. There is only one term.
 
  • #31
Yes, I see. The sum is gone altogether, so $${ \left[ A^{ 2 } \right] }_{ ij }=A_{ ii }A_{ ii }$$ when ##i=j##, and $${ \left[ A^{ 2 } \right] }_{ ij }=0$$ when ##i\neq j##.

Chris
 
  • #32
That's it! The nth power goes exactly the same way. There is at most one nonzero term in that big summation.
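As a final numerical sanity check, here is a short NumPy sketch (arbitrary diagonal entries and power) comparing ##D^n## against the diagonal matrix of nth powers:

[code]
import numpy as np

D = np.diag([2.0, -1.0, 0.5])
n = 7

# D^n by repeated matrix multiplication...
Dn = np.linalg.matrix_power(D, n)

# ...equals the diagonal matrix of nth powers of D's diagonal entries.
print(np.allclose(Dn, np.diag(np.diag(D) ** n)))  # True
[/code]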
 

FAQ: Power of a Diagonalized Matrix?

What is the power of a diagonalized matrix?

The power of a diagonalized matrix refers to raising the matrix to a certain exponent, such as 2 or 3. This is done by multiplying the matrix by itself the specified number of times.

How is a matrix diagonalized?

A matrix ##M## can be diagonalized by finding its eigenvalues and eigenvectors. The eigenvectors form the columns of a matrix ##C## such that $$D=C^{-1}MC$$ is diagonal; ##D## is similar to the original matrix, with the eigenvalues of ##M## on its diagonal.
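A minimal NumPy sketch of this (the example matrix is arbitrary but has distinct eigenvalues, so it is diagonalizable):

[code]
import numpy as np

M = np.array([[4.0, 1.0],
              [2.0, 3.0]])  # eigenvalues 5 and 2

# Columns of C are the eigenvectors of M.
eigvals, C = np.linalg.eig(M)

# D = C^{-1} M C is diagonal, with the eigenvalues on the diagonal.
D = np.linalg.inv(C) @ M @ C
print(np.allclose(D, np.diag(eigvals)))  # True
[/code]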

What is the significance of a diagonalized matrix?

A diagonalized matrix is significant because it simplifies calculations and makes it easier to find the powers of the matrix. It also reveals important information about the matrix, such as its eigenvalues and eigenvectors.

Can any matrix be diagonalized?

No, not all matrices can be diagonalized. A matrix can only be diagonalized if it has a full set of linearly independent eigenvectors.

How is the power of a diagonalized matrix calculated?

The power of a diagonalized matrix is calculated by raising each diagonal element to the specified power. This works because the diagonal elements are the eigenvalues of the original matrix ##M##, and $$M^{n}=CD^{n}C^{-1}$$, so raising the eigenvalues to the nth power gives the nth power of the original matrix after undoing the change of basis.
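Putting the pieces together, a minimal NumPy sketch (same arbitrary example matrix as above) of computing a power through the diagonalization $$M^{n}=CD^{n}C^{-1}$$:

[code]
import numpy as np

M = np.array([[4.0, 1.0],
              [2.0, 3.0]])
n = 5

eigvals, C = np.linalg.eig(M)

# Raise only the eigenvalues to the nth power, then undo the change of basis.
Mn = C @ np.diag(eigvals ** n) @ np.linalg.inv(C)

print(np.allclose(Mn, np.linalg.matrix_power(M, n)))  # True
[/code]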
