Problem involving eigenvalues/vectors

  • Thread starter jacks0123
In summary, the problem deals with a geometric sequence of vectors in which the nth term is obtained by multiplying the (n-1)th term by a 2x2 matrix. The question asks whether the sequence converges as n approaches infinity, what conditions are sufficient for convergence, and to what vector it converges in each case. The formula for the sum of the first n vectors in the sequence is also discussed, along with its convergence conditions and the vectors it converges to. The concept of eigenvalues and eigenvectors is introduced and used to explain the conditions for convergence.
  • #1
jacks0123
Hi!
Please help me with this problem which must be solved using eigenvalues and eigenvectors:
A geometric sequence of vectors (2x1 column vectors), where each term is obtained from the previous one by multiplying it by a 2x2 matrix:
tn = (R^(n-1))*a
Where:
tn is the nth vector in the sequence
a is the first vector in the sequence
R is the 2x2 matrix
R=
[a b]
[c d]


1. Does tn converge as n->infinity? What conditions are sufficient for the sequence to converge? To what vector does tn converge in each case?

2. What is the formula for the sum of the first n vectors in this sequence? Under what conditions, and to what vector, does that sum converge?
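
For anyone who wants to experiment numerically before doing the algebra, here is a rough sketch (the values of R and a below are made-up examples, not part of the problem, and NumPy is assumed):

```python
import numpy as np

# Made-up example values, chosen so both eigenvalues of R have absolute value < 1.
R = np.array([[0.5, 0.1],
              [0.2, 0.4]])
a = np.array([1.0, 2.0])    # first vector t1 in the sequence

n = 50
t_n = np.linalg.matrix_power(R, n - 1) @ a                       # tn = R^(n-1) a
S_n = sum(np.linalg.matrix_power(R, k) @ a for k in range(n))    # t1 + ... + tn

print(t_n)   # close to the zero vector for this choice of R
print(S_n)
# If I - R is invertible and R^n -> 0, the infinite sum approaches (I - R)^(-1) a:
print(np.linalg.solve(np.eye(2) - R, a))
```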

Thanks!
 
Last edited:
  • #2
Welcome to PF, jacks0123! :smile:

For starters, is it possible that your condition should be ((a+d)^2)-4detR>0?

Did you try anything?
How far did you get?

Can you say anything about the eigenvalues and eigenvectors of R based on the condition ((a+d)^2)-4detR>0?

Can you diagonalize R?
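
In case it is useful, here is a sketch of where that expression comes from: it is just the discriminant of the 2x2 characteristic polynomial of R,

```latex
\det(R - \lambda I)
  = \det\begin{pmatrix} a-\lambda & b \\ c & d-\lambda \end{pmatrix}
  = \lambda^2 - (a+d)\lambda + (ad - bc)
  = \lambda^2 - (a+d)\lambda + \det R .
```

Its discriminant is (a+d)^2 - 4 det R, so the condition (a+d)^2 - 4 det R > 0 means the characteristic polynomial has two distinct real roots: two distinct real eigenvalues, and hence two independent eigenvectors.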
 
  • #3
First, answer the questions for the case where the first vector in the sequence is an eigenvector of R. (Hint: the answers will depend on the corresponding eigenvalue.)

Then, think how you can use those answers for an arbitrary starting vector.
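
For example (a sketch of the special case, writing t for the eigenvalue to match the notation used later in the thread): if the first vector a is an eigenvector e of R with eigenvalue t, then

```latex
t_n = R^{n-1} a = R^{n-1} e = t^{\,n-1} e ,
```

so the sequence converges to the zero vector if |t| < 1, is constant (equal to e) if t = 1, and does not converge otherwise.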
 
  • #4
I do not understand how to do question 1 at all.

My friend showed me how but i do not understand what he is talking about. Could someone explain this in more detail?

Suppose you decompose a into its eigenvector components, say a = k1e1 + k2e2 where e1 and e2 are the eigenvectors, then you apply R to it many times. The e1 component will blow up to infinity if abs(k1)>1 and similarly for e2. So for convergence we have the following alternatives:
(a) a=0 obviously never changes
(b) abs(k2)<1, then it converges to zero if abs(k1)<1 and it converges to e1 if k1=1
(c) abs(k1)<1, then it converges to zero if abs(k2)<1 and it converges to e2 if k2=1
 
  • #5
Suppose the eigenvalues are t1 and t2, then you need to replace abs(k1) by abs(t1), and abs(k2) by abs(t2).

This is because:

R a = R (k1e1 + k2e2) = k1 (R e1) + k2 (R e2) = k1 t1 e1 + k2 t2 e2

R^2 a = R (k1 t1 e1 + k2 t2 e2) = k1 t1^2 e1 + k2 t2^2 e2

R^n a = k1 t1^n e1 + k2 t2^n e2

So the blowing up is with t1 and t2.
If either abs(t1) or abs(t2) is greater than 1, the result blows up.
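
If it helps to see this numerically, here is a rough NumPy sketch (the matrix R and the vector a are made-up example values, not from the original problem):

```python
import numpy as np

# Made-up example matrix and starting vector.
R = np.array([[0.6, 0.2],
              [0.2, 0.3]])
a = np.array([1.0, 2.0])

# Eigenvalues t1, t2 and eigenvectors e1, e2 (the columns of E).
t, E = np.linalg.eig(R)
# Coefficients k1, k2 of the decomposition a = k1*e1 + k2*e2.
k = np.linalg.solve(E, a)

n = 10
lhs = np.linalg.matrix_power(R, n) @ a                     # R^n a
rhs = k[0] * t[0]**n * E[:, 0] + k[1] * t[1]**n * E[:, 1]  # k1 t1^n e1 + k2 t2^n e2
print(lhs)
print(rhs)   # should agree with lhs up to rounding error
```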
 
Last edited:
  • #6
I'm a bit confused. What is t and what is k? How did you get from
k1 (R e1) + k2 (R e2)
to
k1 t1 e1 + k2 t2 e2
 
  • #7
Just got up. :zzz:

jacks0123 said:
I'm a bit confused. What is t and what is k? How did you get from
k1 (R e1) + k2 (R e2)
to
k1 t1 e1 + k2 t2 e2

k1 and k2 are defined by the decomposition of "a" into the eigenvectors e1 and e2.
Any 2D vector can be decomposed into a linear combination of 2 independent vectors.

And oh, I meant t1 and t2 to be the eigenvalues of R.
I'll edit my previous post to match.
This means R e1 = t1 e1 since that is the definition of an eigenvalue and its eigenvector.
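
In other words, finding k1 and k2 amounts to solving a 2x2 linear system (a sketch, writing the eigenvectors as the columns of a matrix):

```latex
a = k_1 e_1 + k_2 e_2
\quad\Longleftrightarrow\quad
\begin{pmatrix} e_1 & e_2 \end{pmatrix}
\begin{pmatrix} k_1 \\ k_2 \end{pmatrix} = a ,
```

and the matrix (e1 e2) is invertible precisely because e1 and e2 are independent, so k1 and k2 exist and are unique.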
 
Last edited:
  • #8
Hi guys, I'm stuck with the same problem... I still don't get what k1 and k2 are. What do you mean by the decomposition of "a" into the eigenvectors e1 and e2? Please reply fast.
 
  • #9
2 independent vectors form a basis for ℝ^2.
Any 2D vector can be decomposed as a linear combination of the (independent) vectors in a basis.

Key to this problem is that there are 2 independent eigenvectors.
The condition given guarantees that, although it is still something that you would need to prove.
 
  • #10
jack201 said:
Hi guys, I'm stuck with the same problem... I still don't get what k1 and k2 are. What do you mean by the decomposition of "a" into the eigenvectors e1 and e2? Please reply fast.

Think of it like decomposing a 3D vector into x, y, z components. The eigenvectors of the matrix play the same role here: they are linearly independent, so they form a basis, and writing "a" as a combination of them "decomposes" it into those basis vectors.
 

FAQ: Problem involving eigenvalues/vectors

1. What is an eigenvalue/vector?

An eigenvalue is a scalar that gives the factor by which its eigenvector is scaled when multiplied by a given matrix. An eigenvector is a vector that, when multiplied by a given matrix, yields a multiple of itself, i.e. it only changes in scale.
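
A small concrete example:

```latex
\begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}
\begin{pmatrix} 1 \\ 0 \end{pmatrix}
= 2 \begin{pmatrix} 1 \\ 0 \end{pmatrix},
```

so (1, 0) is an eigenvector of this matrix with eigenvalue 2, and similarly (0, 1) is an eigenvector with eigenvalue 3.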

2. Why are eigenvalues/vectors important in problem solving?

Eigenvalues and eigenvectors are important in problem solving because they help us understand how a matrix transforms a vector. They also have various applications in fields such as physics, engineering, and economics.

3. How do you find eigenvalues/vectors?

To find eigenvalues and eigenvectors, we first set up and solve the characteristic equation. This involves taking the determinant of the matrix minus λ times the identity, det(A - λI), and setting it equal to zero. The eigenvalues are the solutions of this equation, and the corresponding eigenvectors are found by substituting each eigenvalue back into (A - λI)v = 0 and solving for v.
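
In practice this is usually done with a library routine; here is a minimal NumPy sketch with a made-up example matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # made-up example matrix

# np.linalg.eig solves det(A - lambda*I) = 0 and returns the eigenvalues
# together with the corresponding (unit-length) eigenvectors as columns.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)    # 3 and 1 for this A (order may vary)
print(eigenvectors)
```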

4. Can a matrix have more than one eigenvalue/vector?

Yes, a matrix can have multiple eigenvalues and eigenvectors. In fact, most matrices have multiple eigenvalues and corresponding eigenvectors. There are also matrices, such as the identity matrix, that have only one distinct eigenvalue (for the identity matrix it is 1, repeated as many times as the dimension of the matrix); in that case every nonzero vector is an eigenvector.

5. How are eigenvalues/vectors used in data analysis?

In data analysis, eigenvalues and eigenvectors are used in a technique called Principal Component Analysis (PCA). This involves finding the eigenvalues and eigenvectors of the data's covariance matrix to identify the most important features or components of the data. This can help with data visualization and dimensionality reduction.
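
A minimal sketch of that idea, using made-up data and NumPy (eigendecomposition of the data's covariance matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))          # made-up data: 100 samples, 2 features
X = X - X.mean(axis=0)                 # center the data

cov = np.cov(X, rowvar=False)          # 2x2 covariance matrix of the features
eigenvalues, eigenvectors = np.linalg.eig(cov)

order = np.argsort(eigenvalues)[::-1]  # sort components by explained variance
principal_axes = eigenvectors[:, order]
X_reduced = X @ principal_axes[:, :1]  # project onto the top principal component
print(X_reduced.shape)                 # (100, 1): reduced from 2 features to 1
```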
