Linear Algebra - Eigenvalue Problem

In summary, the question asks for the representation of a linear operator A as a square matrix with respect to three basis vectors |a>, |b>, and |c>. Using the given properties of A, one constructs a 3 by 3 matrix whose entries are fixed by comparing the result of multiplying each basis vector by the matrix with the stated action of A on that vector. This works because the three vectors span the space, so they can serve as the coordinate vectors <1, 0, 0>, <0, 1, 0>, and <0, 0, 1>.
  • #1
chill_factor

Homework Statement



Let there be 3 vectors that span a space: { |a>, |b>, |c> } and let n be a complex number.

If the operator A has the properties:

A|a> = n|b>
A|b> = 3|a>
A|c> = (4i+7)|c>

What is A in terms of a square matrix?

Homework Equations



det(A-Iλ)=0

The Attempt at a Solution



I don't even know how to start. Can someone give me a starting hint?
 
  • #2
Imagine |a>=(1,0,0), |b>=(0,1,0), |c>=(0,0,1)
What would the square matrix be then?
Once you've got that, how can you relate your "hypothetical" square matrix to the correct one?
 
  • #3
Since A maps a three dimensional vector space to itself, it can be represented as a 3 by 3 matrix and so can be written
[tex]\begin{bmatrix}a & b & c \\ d & e & f \\ g & h & i\end{bmatrix}[/tex]

Since you have basis vectors |a>, |b>, and |c>, they can be written as <1, 0, 0>, <0, 1, 0>, and <0, 0, 1>, respectively. What do you get when you multiply each of those by the matrix above?

You are told that "A|a> = n|b>"; that is, what you got by multiplying A by <1, 0, 0> must be equal to <0, n, 0> for this complex number n. Compare the two.

Similarly, you are told that "A|b> = 3|a>". In other words, what you got by multiplying A by <0, 1, 0> must be equal to <3, 0, 0>.

Finally, you are told that "A|c> = (4i+7)|c>" so that what you got by multiplying A by <0, 0, 1> must be equal to <0, 0, 4i+7>.
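Putting the three results together: the image of each basis vector is the corresponding column of the matrix, so every entry is fixed:

[tex]A = \begin{bmatrix}0 & 3 & 0 \\ n & 0 & 0 \\ 0 & 0 & 4i+7\end{bmatrix}[/tex]

For example, multiplying this matrix by <1, 0, 0> picks out the first column, <0, n, 0>, which is exactly n|b>.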
 
  • #4
Ok, I didn't think of that. Writing them out in terms of specific numbers is helpful. However, why can we do that?

I think I understand a little why they were written that way. Is it because they are orthogonal elements of a Hilbert space, and since they are orthogonal, we can define a coordinate system in which one is <1,0,0> and the other two are necessarily orthogonal to it and to each other, following the right-hand rule?
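The column-by-column construction described above can also be checked numerically. A minimal NumPy sketch, assuming a sample value n = 2 + i (chosen only for illustration):

```python
import numpy as np

# Sample value for the complex number n (arbitrary, for illustration only)
n = 2 + 1j

# Columns of A are the images of the basis vectors |a>, |b>, |c>
A = np.array([[0, 3, 0],
              [n, 0, 0],
              [0, 0, 7 + 4j]], dtype=complex)

a = np.array([1, 0, 0], dtype=complex)
b = np.array([0, 1, 0], dtype=complex)
c = np.array([0, 0, 1], dtype=complex)

print(np.allclose(A @ a, n * b))        # True: A|a> = n|b>
print(np.allclose(A @ b, 3 * a))        # True: A|b> = 3|a>
print(np.allclose(A @ c, (7 + 4j) * c)) # True: A|c> = (4i+7)|c>
```

Each check prints True because the columns of A are, by construction, the images of the basis vectors.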
 

FAQ: Linear Algebra - Eigenvalue Problem

1. What is an eigenvalue problem in linear algebra?

An eigenvalue problem in linear algebra involves finding the eigenvalues and corresponding eigenvectors of a given matrix. Eigenvalues are the scalars by which a matrix stretches or compresses certain vectors, while eigenvectors are the nonzero vectors whose direction is unchanged by the matrix: they satisfy Av = λv.
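A quick illustration of the definition, using a diagonal matrix whose stretching factors are chosen arbitrarily:

```python
import numpy as np

# A diagonal matrix stretches each coordinate axis by its diagonal entry
D = np.array([[2.0, 0.0],
              [0.0, 3.0]])

v = np.array([1.0, 0.0])            # a vector along the x-axis
print(np.allclose(D @ v, 2.0 * v))  # True: D v = 2 v, so 2 is an eigenvalue
```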

2. Why is solving eigenvalue problems important in linear algebra?

Solving eigenvalue problems is important in linear algebra because it helps in understanding the properties of a matrix and how it affects vectors. This is useful in various applications, such as data analysis, machine learning, and image processing.

3. What is the process for solving an eigenvalue problem?

The process for solving an eigenvalue problem involves first finding the characteristic polynomial of the matrix, then using this polynomial to find the eigenvalues, and finally finding the corresponding eigenvectors using these eigenvalues.
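A short sketch of that process with NumPy (the 2 by 2 symmetric matrix here is an arbitrary example; np.linalg.eig handles the characteristic polynomial internally):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eig returns the roots of det(M - I*lambda) = 0 and the matching eigenvectors
vals, vecs = np.linalg.eig(M)
print(sorted(vals))  # the eigenvalues of this matrix are 1 and 3

# Each column of vecs satisfies M v = lambda v
for lam, v in zip(vals, vecs.T):
    assert np.allclose(M @ v, lam * v)
```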

4. Can eigenvalues be complex numbers?

Yes, eigenvalues can be complex numbers. Even a matrix with real entries can have a characteristic polynomial whose roots are complex, giving complex eigenvalues. However, if the matrix is real and symmetric, its eigenvalues are always real numbers.
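Both cases can be seen in a couple of lines (the matrices are arbitrary illustrations): a real rotation matrix has purely imaginary eigenvalues, while a real symmetric matrix has real ones.

```python
import numpy as np

# A real 90-degree rotation matrix: det(R - I*lambda) = lambda^2 + 1, so lambda = +i, -i
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
print(np.linalg.eig(R)[0])  # both eigenvalues are purely imaginary

# A real symmetric matrix: eigenvalues are guaranteed real
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(np.linalg.eigvalsh(S))  # real eigenvalues, in ascending order
```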

5. How are eigenvalues and eigenvectors used in real-world applications?

Eigenvalues and eigenvectors are used in real-world applications to understand the behavior of complex systems. For example, in physics, they are used to analyze the behavior of vibrating systems, while in engineering, they are used in structural analysis. They are also used in data analysis to identify patterns and reduce the dimensionality of large datasets.
