Linear algebra. Rank. Linear independence.

In summary, given a finite-dimensional vector space $V$ and a linear transformation $T$ on $V$ with eigenvalue $0$, a vector $v \in V$ has rank $r > 0$ with respect to eigenvalue $0$ if $T^rv=0$ but $T^{r-1}v\neq 0$. If $x,y \in V$ are linearly independent and have ranks $r_1$ and $r_2$ respectively, then the set $\{x,Tx,\ldots ,T^{r_1-1}x,y,Ty,\ldots , T^{r_2-1}y \}$ is linearly independent under certain additional conditions: as the reply below shows, the claim fails in general (for example when $y = Tx$), but it holds when, say, neither $x$ nor $y$ lies in the range of $T$ and $r_1 \neq r_2$.
  • #1
caffeinemachine
Let $V$ be a finite dimensional vector space. Let $T$ be a linear transformation on $V$ with eigenvalue $0$. A vector $v \in V$ is said to have rank $r > 0$ w.r.t. eigenvalue $0$ if $T^rv=0$ but $T^{r-1}v\neq 0$. Let $x,y \in V$ be linearly independent and have ranks $r_1$ and $r_2$ w.r.t. eigenvalue $0$ respectively. Show that $\{x,Tx,\ldots ,T^{r_1-1}x,y,Ty,\ldots , T^{r_2-1}y \}$ is a linearly independent set of vectors.

I can see that $\{ x, Tx, \ldots , T^{r_1-1}x \}$ is linearly independent and that $\{ y, Ty, \ldots, T^{r_2-1}y \}$ is linearly independent, but now I am stuck. Please help.
 
  • #2
As stated, this result is clearly false. For example, given $x$ satisfying those conditions for some $r_1>1$, you could take $y=Tx$ (and $r_2 = r_1-1$).

For the result to be true, you need some extra conditions, such as requiring that neither $x$ nor $y$ is in the range of $T$, and that $r_1\ne r_2$. Then the generalised eigenspaces generated by $x$ and $y$ will correspond to different Jordan blocks of $T$ and will therefore be linearly independent.
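To see the counterexample concretely, here is a minimal numerical sketch (my own construction, not from the thread), assuming NumPy is available: $T$ is the $3\times 3$ nilpotent Jordan block, $x$ has rank $3$ w.r.t. eigenvalue $0$, and $y = Tx$ has rank $2$. The five listed vectors contain only three distinct ones, so the set is linearly dependent.

```python
import numpy as np

# Hypothetical check of the counterexample above: T is the 3x3 nilpotent
# Jordan block, so 0 is its only eigenvalue.
T = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])

x = np.array([0., 0., 1.])   # rank 3 w.r.t. eigenvalue 0: T^3 x = 0 but T^2 x != 0
y = T @ x                    # y = Tx has rank 2; x and y are linearly independent

# Build {x, Tx, T^2 x, y, Ty} as the columns of a matrix and test independence
# via its rank: the set is independent iff the rank equals the number of vectors.
vectors = [x, T @ x, T @ T @ x, y, T @ y]
M = np.column_stack(vectors)
print(np.linalg.matrix_rank(M))   # prints 3, not 5 -> the set is linearly dependent
```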
 

FAQ: Linear algebra. Rank. Linear independence.

What is linear algebra?

Linear algebra is a branch of mathematics that deals with the study of linear equations and their representations using vectors and matrices. It is used to solve systems of linear equations and to analyze geometric transformations.

What is rank in linear algebra?

The rank of a matrix is the maximum number of linearly independent rows or columns in that matrix. Equivalently, it is the dimension of the vector space spanned by the columns (or, equally, by the rows) of the matrix. A matrix has full rank when its columns or its rows are linearly independent.
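As a small illustrative sketch (not part of the original thread), NumPy can compute the rank directly; in the made-up matrix below the third row is the sum of the first two, so the rank is $2$ rather than $3$.

```python
import numpy as np

# The third row equals row 1 + row 2, so only two rows are linearly independent.
A = np.array([[1., 0., 2.],
              [0., 1., 3.],
              [1., 1., 5.]])

print(np.linalg.matrix_rank(A))   # 2 -> A does not have full rank
```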

What is linear independence?

Linear independence refers to the property of a set of vectors in a vector space where no vector can be expressed as a linear combination of the other vectors in the set. In other words, no vector in the set is redundant: removing any one of them strictly shrinks the span.

How do you determine linear independence?

To determine whether a set of vectors is linearly independent, form the matrix whose columns are those vectors and use Gaussian elimination to reduce it to row-echelon form. If the reduced matrix has a pivot in every column, the vectors are linearly independent; otherwise, at least one vector is a linear combination of the others.
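A minimal sketch of that procedure, assuming SymPy is available (the vectors here are made up for illustration): place the vectors as the columns of a matrix, row-reduce, and check whether every column is a pivot column.

```python
from sympy import Matrix

# Columns are the candidate vectors; rref() returns the reduced row-echelon
# form together with the indices of the pivot columns.
v1, v2, v3 = [1, 0, 1], [0, 1, 1], [1, 1, 2]   # v3 = v1 + v2, so the set is dependent
M = Matrix.hstack(Matrix(v1), Matrix(v2), Matrix(v3))

rref_form, pivots = M.rref()
independent = len(pivots) == M.shape[1]   # a pivot in every column <=> independent
print(independent)                        # False
```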

What is the importance of linear algebra?

Linear algebra has a wide range of applications in various fields such as physics, engineering, computer science, and economics. It is also a fundamental tool in data analysis and machine learning. Understanding linear algebra is essential for solving complex problems involving linear equations and transformations.
