Prove Rk(A+B) ≤ Rk(A) + Rk(B)

In summary, the conversation discusses a question about matrices and proving that Rk(A+B) is less than or equal to Rk(A) + Rk(B). The attempt at a solution involves using the dimension of the row spaces of A and B, and the possibility of also solving the problem using linear transformations. The question asker is seeking clarification on whether their approach is correct.
  • #1
talolard
Hey Guys, another matrix question.

Homework Statement


Prove: Rk(A+B) ≤ Rk(A) + Rk(B)



The Attempt at a Solution



Rk(A+B) = Dim[R(A+B)] ≤ Dim[R(A) + R(B)]
where R(A) is the row space of A; the inequality holds because each row of A+B is a row of A plus the corresponding row of B, so R(A+B) ⊆ R(A) + R(B).
We know that Dim[R(A)+R(B)] = Dim[R(A)] + Dim[R(B)] - Dim[R(A)∩R(B)]
and since Dim[R(A)∩R(B)] ≥ 0, it follows that Rk(A+B) ≤ Dim[R(A)+R(B)] ≤ Dim[R(A)] + Dim[R(B)] = Rk(A) + Rk(B).
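Not a substitute for the proof, but here is a quick numerical sanity check of the inequality over random integer matrices (a sketch assuming NumPy is available):

```python
# Spot-check rk(A+B) <= rk(A) + rk(B) on random integer matrices.
# This is only a sanity check, not a proof.
import numpy as np

rng = np.random.default_rng(0)
for _ in range(100):
    A = rng.integers(-2, 3, size=(4, 5))
    B = rng.integers(-2, 3, size=(4, 5))
    lhs = np.linalg.matrix_rank(A + B)
    rhs = np.linalg.matrix_rank(A) + np.linalg.matrix_rank(B)
    assert lhs <= rhs, (lhs, rhs)
print("rk(A+B) <= rk(A) + rk(B) held for all 100 samples")
```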

I heard a rumor that this can also be done with linear transformations; can anyone enlighten me on that path?

Is this correct?
Thanks
Tal
 
  • #2
talolard said:
Prove: Rk(A+B) ≤ Rk(A) + Rk(B)

I heard a rumor that this can also be done with linear transformations, can anyone enlighten me on that path?
If F is a linear transformation from U to V then, given specific bases for U and V, there exists a matrix representing F, so essentially we can interpret matrices as linear transformations and vice versa. Anything true of matrices is true of the corresponding linear transformations.
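Concretely, that path goes through images rather than row spaces. A sketch, writing Im for the image (column space) of the map x ↦ Ax:

```latex
% View A, B : F^n \to F^m as linear maps. For every x,
%   (A+B)x = Ax + Bx \in \operatorname{Im}(A) + \operatorname{Im}(B),
% so the image of the sum sits inside the sum of the images:
\operatorname{Im}(A+B) \subseteq \operatorname{Im}(A) + \operatorname{Im}(B)
% Taking dimensions and using \dim(U+W) \le \dim U + \dim W:
\operatorname{rk}(A+B) = \dim \operatorname{Im}(A+B)
  \le \dim\bigl(\operatorname{Im}(A) + \operatorname{Im}(B)\bigr)
  \le \operatorname{rk}(A) + \operatorname{rk}(B)
```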

 

FAQ: Prove Rk(A+B) ≤ Rk(A) + Rk(B)

What does "Rk(A+B) ≤ Rk(A) + Rk(B)" mean?

It means that the rank of the sum of two matrices A and B is at most the sum of their ranks, where the rank of a matrix is the dimension of its row space (equivalently, its column space).

What is the significance of proving this statement?

Rank subadditivity is one of the basic inequalities describing how rank behaves under matrix addition. It is used throughout linear algebra and in applications such as statistics and data analysis, for example to bound the rank of a matrix expressed as a sum of low-rank terms.

How do you prove this statement?

One standard proof uses row spaces: every row of A+B is the sum of the corresponding rows of A and B, so the row space of A+B is contained in R(A) + R(B). Taking dimensions and using dim(U+W) ≤ dim U + dim W gives Rk(A+B) ≤ Rk(A) + Rk(B). The same argument works with column spaces, viewing the matrices as linear transformations.

Can you provide an example to illustrate this statement?

Sure. Let A = diag(1, 0) and B = diag(0, 1), each of rank 1. Then A + B is the 2×2 identity matrix, which has rank 2 = 1 + 1, so the bound can be attained. On the other hand, taking B = -A gives A + B = 0, of rank 0, so the inequality can also be strict.
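A small NumPy sketch (assuming NumPy is installed) showing both that the bound can be attained and that it can be strict:

```python
# rk(A+B) <= rk(A) + rk(B): tight for A = diag(1,0), B = diag(0,1),
# strict when B = -A (the sum is the zero matrix). Sanity check only.
import numpy as np

A = np.diag([1, 0])
B = np.diag([0, 1])
print(np.linalg.matrix_rank(A + B))   # 2: equality, since rk(A) + rk(B) = 1 + 1
print(np.linalg.matrix_rank(A - A))   # 0: strict, since rk(A) + rk(-A) = 2
```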

What other related statements or theorems are important to know in this context?

Closely related results include the Rank-Nullity Theorem (rank plus nullity equals the number of columns), the product bound Rk(AB) ≤ min(Rk(A), Rk(B)), and Sylvester's rank inequality Rk(AB) ≥ Rk(A) + Rk(B) − n for a matrix A with n columns. Together these describe how rank behaves under the basic matrix operations.
