Understanding: Eigenvalues & Eigenvectors/Diagonalizing

  • Thread starter ahmed markhoos
  • Start date
  • Tags
    Eigenvalues
  • #1
ahmed markhoos
Hello,

I'm having trouble understanding this particular part. I don't know, it seems too dry and beyond my ability to picture the problems, and at the same time I feel like there are too many gaps in the way the book explains the subject.

I'm using "Mathematical Methods in the Physical Sciences" by Mary Boas.

Are there any useful references or YouTube lectures you can suggest for me?
 
  • #2
We can't help unless you try to dive into the problem and tell us which part of the chapter you can't grasp. As a preliminary: do you know what a matrix is and what operations exist among matrices?
 
  • #3
You know, in the end it's a methods book, not a pure mathematics book. The problem is that I think there are missing details in the section I'm reading, which is on eigenvalues & eigenvectors and diagonalizing matrices.

OK, the same book I mentioned, chapter 3, section 11. And yes, I know what a matrix is and what the operations are.
 
  • #5
What do you know about eigenvectors and eigenvalues? The basic definition of 'eigenvalue' and 'eigenvector' is that [itex]\vec{v}[/itex] is an eigenvector of linear transformation A, corresponding to eigenvalue [itex]\lambda[/itex], if and only if [itex]A\vec{v}= \lambda\vec{v}[/itex], so that, in its simplest sense, A simply acts like multiplication by [itex]\lambda[/itex] when applied to [itex]\vec{v}[/itex]. Of course, just one vector acting like that wouldn't be very useful, but it is easy to prove: "The set of all eigenvectors corresponding to eigenvalue [itex]\lambda[/itex] (together with the zero vector) forms a subspace." If [itex]\vec{u}[/itex] and [itex]\vec{v}[/itex] are both eigenvectors of A corresponding to the same eigenvalue [itex]\lambda[/itex], then, for any numbers a and b, [itex]a\vec{u}+ b\vec{v}[/itex] is also an eigenvector (or the zero vector).
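
For concreteness, here is a quick numerical check of that definition in Python with numpy (my own toy example, not from Boas):

[code]
# Quick numerical check that A v = lambda v for each eigenpair (illustrative only).
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

for k in range(len(eigenvalues)):
    lam = eigenvalues[k]
    v = eigenvectors[:, k]                      # k-th eigenvector is the k-th column
    print(lam, np.allclose(A @ v, lam * v))     # True: A acts like multiplication by lambda on v
[/code]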

An important result of that is: "If we can find a basis for the vector space consisting entirely of eigenvectors of A, then A, written as a matrix using that particular basis, is a diagonal matrix with its eigenvalues on the diagonal." To see that, you need to recognize that applying a matrix M to the basis vectors of the vector space gives the columns of M. That is, if [itex]Me_i= a_1e_1+ a_2e_2+ \cdot\cdot\cdot+ a_ne_n[/itex], then the ith column of the matrix must be [itex]\begin{bmatrix}a_1 \\ a_2 \\ \vdots \\ a_n\end{bmatrix}[/itex]. To see that, recognize that [itex]e_i[/itex] is written as a column with all "0"s except for a "1" in the ith place, so that multiplying each row of M by it picks out only the ith entry of that row; the result is exactly the ith column of M.
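
If you want to watch that change of basis happen numerically, here is a short Python/numpy sketch (again just an illustration, with a 2x2 symmetric matrix picked for convenience):

[code]
# Sketch: writing A in a basis of its eigenvectors gives a diagonal matrix (illustrative).
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, P = np.linalg.eig(A)    # columns of P are eigenvectors of A

D = np.linalg.inv(P) @ A @ P         # A expressed in the eigenvector basis
print(np.round(D, 10))               # diagonal, with the eigenvalues on the diagonal
[/code]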

However, not every linear transformation has that property. That is, not every matrix can be "diagonalized". If all eigenvalues are different, then the corresponding eigenvectors must be independent, so there exists a basis of eigenvectors. Even if there are not n different eigenvalues, the repeated eigenvalues may still supply enough independent eigenvectors to form a basis, but not always.
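
A standard example of a matrix that cannot be diagonalized is the shear matrix [itex]\begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix}[/itex]: its only eigenvalue is 1 (repeated), but the eigenspace is one-dimensional, so there is no basis of eigenvectors. A quick numerical look (Python/numpy, illustrative only):

[code]
# The shear matrix has eigenvalue 1 with multiplicity 2 but only a
# one-dimensional eigenspace, so it cannot be diagonalized.
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)                           # [1. 1.]  -- repeated eigenvalue
print(np.linalg.matrix_rank(eigenvectors))   # 1 -- the returned eigenvectors are (numerically) dependent
[/code]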
 
Last edited by a moderator:
  • #6
What is the practical importance of eigenvalues and eigenvectors? Can somebody please explain clearly what they indicate and how they were invented?
 
  • #7
Eigenvalues and eigenvectors represent the fundamental modes of a linear system.

It helps to consider some physical systems. For instance, when studying the hydrogen atom in quantum mechanics, there is a linear operator for the energy. The eigenvectors of this operator give you the electron orbitals, and the eigenvalues give you the energies associated with particular orbitals.

In fluid dynamics you can derive sound waves by studying the properties of a linear operator. The eigenvectors of this operator give you information about how the wave propagates, and the eigenvalues give you the speed of sound.
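
As a toy illustration of "fundamental modes" (my own example, not from any textbook: two unit masses joined by unit springs), the normal-mode frequencies and shapes come straight out of an eigen-decomposition:

[code]
# Normal modes of two equal masses coupled by springs (illustrative toy model;
# unit masses and unit spring constants are arbitrary choices).
import numpy as np

K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])           # equations of motion: x'' = -K x

omega_sq, modes = np.linalg.eig(K)     # eigenvalues are the squared mode frequencies
print(np.sqrt(omega_sq))               # normal-mode frequencies
print(modes)                           # columns: in-phase and out-of-phase motion patterns
[/code]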
 
  • #8
If a linear transformation, from an n-dimensional vector space to itself, has a "complete set of eigenvectors", that is, n independent eigenvectors, then, using those eigenvectors as basis vectors, the linear transformation can be written as a diagonal matrix with the eigenvalues on the main diagonal.

A diagonal matrix is particularly easy to work with. In particular, a diagonal matrix is invertible if and only if none of the numbers on its diagonal (its eigenvalues) are 0, and in that case its inverse is the diagonal matrix whose diagonal entries are the reciprocals of the original diagonal entries.
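
A quick numerical sanity check of that last statement (Python/numpy, illustrative only):

[code]
# The inverse of a diagonal matrix is diagonal, with reciprocal entries.
import numpy as np

D = np.diag([2.0, 5.0, -1.0])                           # nonzero diagonal entries (eigenvalues)
D_inv = np.linalg.inv(D)

print(np.round(D_inv, 10))                              # diag(0.5, 0.2, -1.0)
print(np.allclose(D_inv, np.diag(1.0 / np.diag(D))))    # True
[/code]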
 

FAQ: Understanding: Eigenvalues & Eigenvectors/Diagonalizing

What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are concepts in linear algebra that are used to understand the behavior of linear transformations. An eigenvector is a nonzero vector whose direction is unchanged by the transformation, and the corresponding eigenvalue is the factor by which that eigenvector is scaled when the transformation is applied.

How are eigenvalues and eigenvectors calculated?

Eigenvalues and eigenvectors can be calculated using a variety of methods, such as the characteristic polynomial method or the power iteration method. The most common method involves solving the characteristic equation det(A - λI) = 0 for the eigenvalues, and then solving (A - λI)v = 0 for the corresponding eigenvectors.
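
For instance, a bare-bones power iteration might look something like this (an illustrative Python sketch, not the only way to do it):

[code]
# Minimal power-iteration sketch: repeated multiplication by A pulls a random
# vector toward the eigenvector of the largest-magnitude eigenvalue.
import numpy as np

def power_iteration(A, num_iters=100):
    v = np.random.default_rng(0).standard_normal(A.shape[0])
    for _ in range(num_iters):
        v = A @ v
        v = v / np.linalg.norm(v)        # renormalize each step to avoid overflow
    lam = v @ A @ v                      # Rayleigh-quotient estimate of the eigenvalue
    return lam, v

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(power_iteration(A))                # dominant eigenvalue 3, eigenvector ~ [1, 1]/sqrt(2)
[/code]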

What is the importance of diagonalizing a matrix?

Diagonalizing a matrix involves finding a new basis in which the matrix is represented by a diagonal matrix. This is important because it simplifies the matrix and makes it easier to perform calculations and understand the behavior of the linear transformation represented by the matrix.
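
A typical payoff is computing matrix powers: if A = PDP⁻¹, then Aᵏ = PDᵏP⁻¹, and powering a diagonal matrix just means powering its diagonal entries. A short sketch (Python/numpy, illustrative only):

[code]
# Computing a matrix power via diagonalization: A^k = P D^k P^{-1}.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
k = 10

eigenvalues, P = np.linalg.eig(A)
A_k = P @ np.diag(eigenvalues**k) @ np.linalg.inv(P)    # power only the eigenvalues

print(np.allclose(A_k, np.linalg.matrix_power(A, k)))   # True
[/code]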

Is diagonalization always possible?

No. Diagonalization only applies to square matrices, and even then it is not always possible. A square matrix with n distinct eigenvalues is always diagonalizable; a matrix with repeated eigenvalues is diagonalizable only if each repeated eigenvalue still provides enough independent eigenvectors (the identity matrix, for example, has a repeated eigenvalue but is already diagonal, while a shear matrix is not diagonalizable).

How is diagonalization used in real-world applications?

Diagonalization is used in various fields, such as physics, engineering, and computer science, to solve problems involving linear transformations. For example, it can be used to model the behavior of a system over time or to compress and manipulate data in image and signal processing applications.
