Minimal Polynomials & Diagonalization: P_2(\mathbb{C}) & M_{k x k}(\mathbb{R})

In summary, we discussed the computation of minimal polynomials for two operators: T, a linear operator on the vector space P_2(\mathbb{C}), and the operator A ↦ -2Aᵗ on the vector space M_{k \times k}(\mathbb{R}). For the second operator, an initial guess of p(t) = 2 + t for the minimal polynomial led to a contradiction, which was traced to confusion between identity matrices of different sizes in the computation of the characteristic polynomial, along with some confusion over the notation (Tf)(x) in part (a). Ultimately, it was determined that the operator on M_{k \times k}(\mathbb{R}) can be represented by a k² x k² matrix once elements of the space are written as k²-tuples with respect to a basis.
  • #1
AKG
Compute the minimal polynomials for each of the following operators. Determine which of the following operators is diagonalizable.

a) [itex]T : P_2(\mathbb{C}) \to P_2(\mathbb{C})[/itex], where:

[tex](Tf)(x) = -xf''(x) + (i + 1)f'(x) - 2if(x)[/tex].

b) Let [itex]V = M_{k \times k}(\mathbb{R})[/itex].

[itex]T : V \to V[/itex] by [itex]T(A) = -2A^t[/itex].

For (a), I think the notation (Tf)(x) is confusing me a little. Do they mean T(f(x))? If f = ax² + bx + c, am I right in saying that:

(Tf)(x) = x²(-2ia) + x(2i)(a - b) + (b + i(b - 2c))?
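That expansion can be sanity-checked symbolically. The sketch below (assuming sympy is available; the symbol names are arbitrary) applies the rule (Tf)(x) = -xf''(x) + (i + 1)f'(x) - 2if(x) to f = ax² + bx + c and compares it with the proposed answer:

```python
import sympy as sp

a, b, c, x = sp.symbols('a b c x')
f = a*x**2 + b*x + c

# (Tf)(x) = -x f''(x) + (i + 1) f'(x) - 2i f(x)
Tf = sp.expand(-x*sp.diff(f, x, 2) + (sp.I + 1)*sp.diff(f, x) - 2*sp.I*f)

# Proposed answer: x^2(-2ia) + x(2i)(a - b) + (b + i(b - 2c))
guess = sp.expand(x**2*(-2*sp.I*a) + x*(2*sp.I)*(a - b) + (b + sp.I*(b - 2*c)))

print(sp.simplify(Tf - guess))  # 0: the two expressions agree
```

The difference simplifies to zero, so the coefficient-by-coefficient expansion above checks out.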

For (b), I start by finding the characteristic polynomial of T. Let B be the matrix representation of T with respect to some ordered basis. Then, the characteristic polynomial of T is:

g(t) = det(B - tI)
g(t) = det(BI + B(½tI))
g(t) = det(B((1 + ½t)I))
g(t) = det((1 + ½t)BI)
g(t) = det((1 + ½t)(-2I))
g(t) = det(-(2 + t)I)
g(t) = (-1)ⁿ(2 + t)ⁿ, where n = dim(V)

I'm not sure whether n = k or n = k². Now, either way, the minimal polynomial of T is the same as the minimal polynomial of B, which will be some power of (2 + t). Let's try the first power, so the minimal polynomial is:

p(t) = 2 + t

Then:

p(B) = 2I + B = 2I + BI = 2I - 2I = 0, so the first power seems right.

Now, for some [itex]A \in V, A \neq A^t[/itex], we have:

[tex]0 = p(B)(A) = (2I + B)A = 2A + BA = 2A - 2A^t = 2(A - A^t) \neq 0,[/tex]

a contradiction. Where did I go wrong?
 
  • #2
What is [itex]P_2(\mathbb{C})[/itex]? I'm going to assume it means complex quadratic polynomials.


Notice the domain and range of T: it takes a polynomial and spits out another polynomial. The parentheses are correct: you compute Tf first. Tf is a polynomial, so you can evaluate it at x: (Tf)(x).


Your problem for (b) is that you got your identity matrices confused. In:

g(t) = det(B - tI)

I is the identity operator on V, a k^4 dimensional space. However, you treated it as if it was the identity matrix in V, a k^2 dimensional space, when you went to the next line.

(BTW, did you remember to check that T was linear before you assumed it was a matrix?)


I think diagonalizing first is a better approach for this one. I thought it was fairly easy to find k^2 linearly independent eigenvectors of T. Then, once you have the eigenvalues, you can write down the minimum polynomial.
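The eigenvector approach can be sketched numerically: symmetric matrices satisfy Aᵗ = A, so T(A) = -2A (eigenvalue -2), while antisymmetric matrices satisfy Aᵗ = -A, so T(A) = 2A (eigenvalue 2). These subspaces have dimensions k(k+1)/2 and k(k-1)/2, which sum to k², giving k² independent eigenvectors. A minimal check (assuming numpy; k = 3 and the particular matrices are arbitrary choices):

```python
import numpy as np

T = lambda A: -2 * A.T  # the operator from part (b)

# A symmetric matrix: T(S) = -2 S^t = -2 S, so eigenvalue -2
S = np.array([[1., 2., 3.],
              [2., 4., 5.],
              [3., 5., 6.]])
print(np.allclose(T(S), -2 * S))  # True

# An antisymmetric matrix: T(K) = -2 K^t = 2 K, so eigenvalue 2
K = np.array([[ 0.,  1., -2.],
              [-1.,  0.,  3.],
              [ 2., -3.,  0.]])
print(np.allclose(T(K), 2 * K))   # True
```

With eigenvalues ±2 and a full set of k² eigenvectors, T is diagonalizable, and the minimal polynomial is (t - 2)(t + 2) for k ≥ 2.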
 
  • #3
Hurkyl said:
Your problem for (b) is that you got your identy matrices confused. In:

g(t) = det(B - tI)

I is the identity operator on V, a k^4 dimensional space.
I think you mean k²-dimensional.
Hurkyl said:
I think diagonalizing first is a better approach for this one. I thought it was fairly easy to find k^2 linearly independent eigenvectors of T. Then, once you have the eigenvalues, you can write down the minimum polynomial.
I checked that T was linear, so there should be a matrix representation of it. You make it seem that the matrix should be a (k² x k²)-matrix. If B is this matrix, then:

T(v) = Bv for all v in V. But this means that we're multiplying a (k² x k²)-matrix, B, by a (k x k)-matrix v, which isn't possible.

Another thing: assuming all else was right, and I found g(t) correctly (with n = k² according to you), then the rest should still hold, and I should still get that contradiction.
 
  • #4
Basically, T is linear, so it should have a matrix representation, and if v is in V, then T(v) = Bv, so B must be a (k x k)-matrix. However, when we have operators over n-dimensional spaces, their matrices are usually (n x n)-matrices, and since we have a k²-dimensional space here, we would expect a (k² x k²)-matrix. Now, is there any reason why the fact that we don't have such a matrix is problematic? Do we really get any contradictions, or is it just unusual? As far as I can tell, it isn't really a problem. What do you think?
 
  • #5
First thing you have to notice for (b):

V, the vector space of all kxk matrices over R, is a k^2 dimensional vector space.

T is a linear operator on V, so it does have a matrix representation, which you can get by selecting a basis for V...

However, just like for any other vector space, you also have to write the elements of V in terms of the basis vectors -- so if you're writing T as a matrix, you have to write elements of V as k^2-tuples, and T would indeed be written as a k^2 x k^2 matrix.
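The k²-tuple idea can be made concrete: pick the basis of matrices with a single 1 entry, flatten each into a length-k² vector, apply T, and record the result as a column of B. A sketch (assuming numpy; k = 3 and row-major flattening are arbitrary choices):

```python
import numpy as np

k = 3
n = k * k

# Build the k^2 x k^2 matrix of T(A) = -2 A^t with respect to the standard basis.
B = np.zeros((n, n))
for j in range(n):
    e = np.zeros(n)
    e[j] = 1.0                      # j-th basis matrix, written as a k^2-tuple
    A = e.reshape(k, k)
    B[:, j] = (-2 * A.T).ravel()    # column j is T(basis matrix), flattened

I = np.eye(n)
print(B.shape)                                # (9, 9): a k^2 x k^2 matrix
print(np.allclose((B - 2*I) @ (B + 2*I), 0))  # True: (t - 2)(t + 2) annihilates B
print(np.allclose(B + 2*I, 0))                # False: t + 2 alone does not
```

This also resolves the contradiction from post #1: the minimal polynomial is (t - 2)(t + 2), not 2 + t, because B is -2 times the transpose-permutation matrix, not -2 times the identity.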
 
  • #6
Hurkyl said:
you have to write elements of V as k^2-tuples
Oh, perfect. Thanks!
 

FAQ: Minimal Polynomials & Diagonalization: P_2(\mathbb{C}) & M_{k x k}(\mathbb{R})

What is a minimal polynomial?

The minimal polynomial of a matrix (or linear operator) A is the monic polynomial p of lowest degree such that p(A) = 0. Every polynomial that annihilates A, including the characteristic polynomial, is a multiple of the minimal polynomial.

How is a minimal polynomial related to diagonalization?

The minimal polynomial determines diagonalizability directly: a matrix is diagonalizable over a field if and only if its minimal polynomial factors over that field into distinct linear factors. Its roots are exactly the eigenvalues of the matrix, so computing the minimal polynomial also yields the eigenvalues needed to write down the diagonal matrix similar to the original one.
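As a small illustration (assuming sympy; the example matrices are arbitrary), compare a Jordan block with a scalar matrix: both have the single eigenvalue 2, but only the scalar matrix has a squarefree minimal polynomial, so only it is diagonalizable:

```python
import sympy as sp

J = sp.Matrix([[2, 1], [0, 2]])   # Jordan block: minimal polynomial (t - 2)^2
D = 2 * sp.eye(2)                 # scalar matrix: minimal polynomial t - 2

# J - 2I is nonzero but squares to zero, so (t - 2)^2 is minimal for J
print((J - 2*sp.eye(2)).is_zero_matrix)        # False
print(((J - 2*sp.eye(2))**2).is_zero_matrix)   # True

print(J.is_diagonalizable(), D.is_diagonalizable())  # False True
```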

What is the significance of P_2(\mathbb{C}) & M_{k x k}(\mathbb{R}) in minimal polynomials and diagonalization?

P_2(\mathbb{C}) is the vector space of polynomials of degree at most 2 with complex coefficients, while M_{k x k}(\mathbb{R}) is the vector space of all kxk real matrices. These spaces are the domains of the two operators in this thread, and they illustrate that minimal polynomials and diagonalization apply to linear operators on any finite-dimensional vector space, not just to matrices acting on column vectors.

Can a matrix have multiple minimal polynomials?

No. The minimal polynomial of a matrix is unique: it is the monic annihilating polynomial of least degree, and if two distinct monic polynomials of that degree both annihilated the matrix, their difference (after rescaling to be monic) would be an annihilating polynomial of strictly smaller degree, a contradiction. Repeated eigenvalues do not change this; they only affect the multiplicities of the factors in the minimal polynomial.

How can minimal polynomials and diagonalization be used in practical applications?

Minimal polynomials and diagonalization are important concepts in linear algebra, and they have various applications in fields such as engineering, physics, and computer science. They are used in solving systems of linear differential equations, analyzing circuits, and computing eigenvectors and eigenvalues of large matrices, to name a few examples.
