Can a theorem simplify finding eigenvalues of a block matrix?

  • #1
goulio
I need to find the eigenvalues and eigenvectors of a matrix of the form
[tex]
\left ( \begin{array}{cc}
X_1 & X_2 \\
X_2 & X_1
\end{array} \right )
[/tex]
where the [itex]X_i[/itex]'s are themselves [itex]M \times M[/itex] matrices of the form
[tex]
X_i = x_i \left ( \begin{array}{cccc}
1 & 1 & \cdots & 1 \\
1 & 1 & \cdots & 1 \\
\vdots & \vdots & \ddots & \vdots \\
1 & 1 & \cdots & 1
\end{array} \right )
[/tex]
Is there any theorem that could help? Something like: if you find the eigenvalues of the [itex]X_i[/itex]'s, then the eigenvalues of the block matrix are...

Thanks
 
  • #2
Yours are not "circulant" matrices, but they are sort of similar. Maybe you will get some ideas by learning about circulant matrices on Wikipedia (and it never hurts to learn a little more about matrices):

http://en.wikipedia.org/wiki/Circulant_matrix

Carl
 
  • #3
Yeah, there's a theorem... it was part of my dynamical systems course; actually you should learn it in an ODE class. Sorry, I don't have my text nearby.
It's something to do with nilpotent matrices, if I recall correctly.
 
  • #4
I found out that the matrix can be rewritten as
[tex]
\left ( \begin{array}{cc}x_1 & x_2 \\
x_2 & x_1
\end{array} \right ) \otimes
\left ( \begin{array}{cccc}1 & 1 & \cdots & 1 \\
1 & 1 & \cdots & 1 \\
\vdots & \vdots & \ddots & \vdots \\
1 & 1 & \cdots & 1 \end{array} \right )
[/tex]
So I now need to prove that the determinant of the matrix of all ones minus [itex]\lambda I[/itex] is
[tex]
(-1)^M \lambda^{M-1}(\lambda - M)
[/tex]
Any ideas?
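Not from the thread, but a quick NumPy check of both claims (the values of M, x1, x2 are arbitrary sample values I chose for illustration):

```python
import numpy as np

M, x1, x2 = 4, 2.0, 3.0                  # arbitrary sample values
J = np.ones((M, M))                      # M x M matrix of all ones
A = np.block([[x1 * J, x2 * J],
              [x2 * J, x1 * J]])

# The block matrix really is a Kronecker product:
P = np.array([[x1, x2], [x2, x1]])
print(np.allclose(A, np.kron(P, J)))     # True

# Characteristic polynomial of J: np.poly gives the coefficients of
# det(lambda*I - J) from highest degree down; expect lambda^(M-1)*(lambda - M),
# i.e. [1, -M, 0, ..., 0].
coeffs = np.poly(J)
expected = np.zeros(M + 1)
expected[0], expected[1] = 1.0, -M
print(np.allclose(coeffs, expected))     # True
```

Note that det(J − λI) = (−1)^M det(λI − J), which is where the (−1)^M in the formula above comes from.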
 
  • #5
You can find that determinant quite easily using row reductions, though since you only want to find eigenvectors and eigenvalues (and that is simple in this case) it is unnecessary.
 
  • #6
I tried evaluating the eigenvectors of the matrix filled with ones for M=6 in mathematica and here's what I get :

{{1, 1, 1, 1, 1, 1},
{-1, 0, 0, 0, 0, 1},
{-1, 0, 0, 0, 1, 0},
{-1, 0, 0, 1, 0, 0},
{-1, 0, 1, 0, 0, 0},
{-1, 1, 0, 0, 0, 0}}

The first one corresponds to the [itex]\lambda = M[/itex] eigenvalue and the others to [itex]\lambda = 0[/itex], but they're not orthogonal with each other; they're only orthogonal with the first one! I know I could take linear combinations of those vectors, but in the case where M is very large this becomes a bit confusing...

Any ideas?
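For what it's worth (not from the thread): since the matrix of ones is real symmetric, `np.linalg.eigh` hands back an orthonormal eigenbasis directly, so no manual linear combinations are needed even for large M:

```python
import numpy as np

M = 6
J = np.ones((M, M))

# J is real symmetric, so eigh returns real eigenvalues in ascending
# order and orthonormal eigenvectors as the columns of V.
w, V = np.linalg.eigh(J)

print(np.allclose(w, [0, 0, 0, 0, 0, M]))    # True: five 0's and one M
print(np.allclose(V.T @ V, np.eye(M)))       # True: columns are orthonormal
print(np.allclose(J @ V, V * w))             # True: each column is an eigenvector
```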
 
  • #7
Why do you want something to be orthogonal to something else? Who has even said we're working with a vector space over a field with a nondegenerate inner product?

By inspection the matrix has either a 0-, 1-, or 2-dimensional image: 0 if x_1 = x_2 = 0, 1 if x_1 = ±x_2 (not both zero), and two otherwise (row reduce).

In any case you can use what you just did to work out the eigenvectors not in the kernel and the eigenvectors that are killed.

Hint: split a vector with 2M entries in half. (1,-1,0,...,0)

is certainly killed by the matrix, as is (0,...,0,1,-1,0,...,0), where there are M zeroes before the 1.
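A numeric sanity check of those kernel vectors (not from the thread; M, x1, x2 are sample values chosen for illustration):

```python
import numpy as np

M, x1, x2 = 4, 2.0, 3.0                  # arbitrary sample values
J = np.ones((M, M))
A = np.block([[x1 * J, x2 * J],
              [x2 * J, x1 * J]])         # the 2M x 2M matrix in question

# (1, -1, 0, ..., 0): the 1 and -1 sit in the same half, so every
# row of the all-ones blocks sums them to zero.
v = np.zeros(2 * M)
v[0], v[1] = 1.0, -1.0
print(np.allclose(A @ v, 0))             # True: v is killed

# (0, ..., 0, 1, -1, 0, ..., 0) with M zeroes before the 1.
u = np.zeros(2 * M)
u[M], u[M + 1] = 1.0, -1.0
print(np.allclose(A @ u, 0))             # True: u is killed
```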
 
  • #8
I think the answer is here:
http://cellular.ci.ulsa.mx/comun/algebra/node65.html
Basically,
if $PX = \lambda X$ and $QY = \mu Y$, then the following holds:

$$(P \otimes Q)(X \otimes Y) = (\lambda\mu)(X \otimes Y)$$.

This implies that $X \otimes Y$ is an eigenvector of $P \otimes Q$ (by definition) with eigenvalue $\lambda\mu$. $M$ is the only non-zero eigenvalue of the $M \times M$ matrix of all ones (it has rank one, so zero has multiplicity $M - 1$, and the trace forces the remaining eigenvalue to be $M$). The eigenvalues of the small matrix of $x$'s can be found in closed form by solving the associated quadratic; they are $x_1 \pm x_2$.
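A small numerical illustration of the theorem (not from the thread; M, x1, x2 are arbitrary sample values):

```python
import numpy as np

M, x1, x2 = 4, 2.0, 3.0                          # arbitrary sample values
P = np.array([[x1, x2], [x2, x1]])
Q = np.ones((M, M))
A = np.kron(P, Q)                                # the full 2M x 2M matrix

# Eigenpair of P: P @ X = (x1 + x2) X with X = (1, 1).
# Eigenpair of Q: Q @ Y = M Y with Y = (1, ..., 1).
lam, X = x1 + x2, np.ones(2)
mu, Y = float(M), np.ones(M)

# (P (x) Q)(X (x) Y) = (lam * mu)(X (x) Y)
XY = np.kron(X, Y)
print(np.allclose(A @ XY, lam * mu * XY))        # True

# Full spectrum: products of {x1+x2, x1-x2} with {M, 0}, i.e.
# M(x1+x2), M(x1-x2), and 0 with multiplicity 2M-2.
ev = np.sort(np.linalg.eigvals(A).real)
print(np.allclose(ev[-1], M * (x1 + x2)))        # True (here 20)
print(np.allclose(ev[0], M * (x1 - x2)))         # True (here -4)
```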
 

FAQ: Can a theorem simplify finding eigenvalues of a block matrix?

What is a block matrix eigenvalue?

A block matrix is a matrix that is partitioned into smaller submatrices, also known as blocks. Its eigenvalues are the ordinary eigenvalues of the whole matrix: the roots of its characteristic polynomial. The block structure matters because it can make those roots much easier to find.

How is a block matrix eigenvalue calculated?

When the block matrix is block diagonal or block triangular, its eigenvalues are simply the eigenvalues of the diagonal blocks taken together. In general the blocks interact, and methods such as the Schur decomposition or block diagonalization are used to reduce the matrix to a form where the blocks can be treated separately.

What is the significance of block matrix eigenvalues?

Block matrix eigenvalues are important in many areas of mathematics and science, particularly in linear algebra and matrix theory. They are used in applications such as signal processing, control theory, and quantum mechanics. Additionally, they can provide insights into the structure and properties of block matrices, which have many practical applications.

Can a block matrix have complex eigenvalues?

Yes, a block matrix can have complex eigenvalues. This is because the eigenvalues of a matrix are determined by its characteristic polynomial, and a polynomial can have complex roots. However, in some cases, a block matrix may only have real eigenvalues, depending on the properties of its individual blocks.

How are block matrix eigenvalues related to the eigenvalues of its individual blocks?

The eigenvalues of a block matrix are related to the eigenvalues of its individual blocks through block diagonalization: if a similarity transformation brings the block matrix into block-diagonal form, the off-diagonal blocks of the transformed matrix are zero, and the eigenvalues of the whole matrix are exactly the eigenvalues of the diagonal blocks combined.
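As a concrete illustration of the block-diagonal case (the example blocks are chosen for illustration, not taken from the thread):

```python
import numpy as np

# Two small blocks with known spectra.
B1 = np.array([[2.0, 0.0], [0.0, 3.0]])         # eigenvalues 2, 3
B2 = np.array([[0.0, 1.0], [1.0, 0.0]])         # eigenvalues -1, 1

# Assemble the block-diagonal matrix.
Z = np.zeros((2, 2))
A = np.block([[B1, Z], [Z, B2]])

# Its spectrum is exactly the union of the blocks' spectra.
ev = np.sort(np.linalg.eigvals(A).real)
print(np.allclose(ev, [-1.0, 1.0, 2.0, 3.0]))   # True
```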
