Proof of linear independence and dependence

In summary: saying "the columns of Y are linearly dependent" does not mean that every column has to be a linear combination of the others; it means that at least one column can be written that way.
  • #1
andorrak
1. Homework Statement

There are two proofs:

Let X and Y be two matrices such that the product XY is defined. Show that if the columns of Y are linearly dependent, then so are the columns of the matrix XY.


Let X and Y be two matrices such that the product XY is defined. Show that if the columns of the matrix XY are linearly independent, then so are the columns of Y.


2. Homework Equations
N/A


3. The Attempt at a Solution

Solution for the first: I do not know.

Solution for the second one, perhaps?

If XY is assumed to be the identity matrix, then we know I = (A^-1)(A).

Therefore A = X and A^-1 = Y, so we know X^-1 = Y. Then by the invertible matrix theorem, the equation Ax = 0 has only the trivial solution, and the columns must be linearly independent?

That's my two cents. Can anyone help me?
 
  • #2
For the first part, start with
[tex]
Y = \begin{pmatrix} y_1 & \cdots & y_i & \cdots & y_n \end{pmatrix}
[/tex]

[tex]
X = \begin{pmatrix} x_1^T \\ \vdots \\ x_j^T \\ \vdots \\ x_m^T \end{pmatrix}
[/tex]

Now assume some y_i = a.y_r + b.y_s (i.e., a linearly dependent column vector, for some r, s) and consider the action of the multiplication (each element of XY corresponds to a dot product between a row vector of X and a column vector of Y).

So what is x^T dot (a.y_r + b.y_s)?
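As for the second proof: it is exactly the contrapositive of the first statement, so once the first part is done, the second follows at once with no invertibility assumption (note the attempt above assumes XY = I, which the problem does not grant):

[tex]
\big(\text{cols of } Y \text{ dependent} \implies \text{cols of } XY \text{ dependent}\big) \iff \big(\text{cols of } XY \text{ independent} \implies \text{cols of } Y \text{ independent}\big)
[/tex]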
 
  • #3
Um, could you elaborate? I have no idea what you are saying. Sorry.
 
  • #4
I rewrote the above for clarity and expanded a little.
 
  • #5
Sorry to sound like I'm nagging.

But I still have no idea what you are saying, or what significance it has. I am only on chapter 2 of the linear algebra book by Lay. I assumed it had something to do with the invertible matrix theorem.
 
  • #6
Try expanding this (for arbitrary vectors):

x^T dot (a.y_r + b.y_s)
 
  • #7
What is the dot product? If you could explain it systematically, that would be very helpful.
 
  • #9
Oh, that's just matrix multiplication. But I do not understand your notation, i.e., this:

Try expanding this (for arbitrary vectors):

x^T dot (a.y_r + b.y_s)

So I assume x^T is a column of the X matrix, and r and s are the dependent vectors, and the dots between a.y and b.y I then assume are the dot products. But where are the a's coming from? The entries of the X matrix?
 
  • #10
x_i^T is a row vector of X

Say a column vector y_j can be written as a linear combination of other column vectors y_r, y_s, for constants a, b (the dots earlier were just scalar multiplication):

[tex] y_j = a y_r + b y_s [/tex]

[tex] x_i^T \bullet y_j = x_i^T \bullet (a y_r + b y_s) [/tex]
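Expanding that dot product (a small added step, using only the linearity of the dot product):

[tex] x_i^T \bullet (a y_r + b y_s) = a \, (x_i^T \bullet y_r) + b \, (x_i^T \bullet y_s) [/tex]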
 
  • #11
So consider the jth column of XY:
[tex]
(XY)_j = \begin{pmatrix} x_1^T \bullet y_j \\ \vdots \\ x_i^T \bullet y_j \\ \vdots \\ x_m^T \bullet y_j \end{pmatrix}
[/tex]
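Putting the two previous posts together, the jth column of XY inherits the same relation (a one-line finish, for reference):

[tex]
(XY)_j = X y_j = X (a y_r + b y_s) = a \, X y_r + b \, X y_s = a \, (XY)_r + b \, (XY)_s
[/tex]

so the columns of XY are linearly dependent.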
 
  • #12
OH, I think I get it. Because you assume a column of Y is linearly dependent, you can write it as a linear combination of the others (which is what a and b represent: the coefficients). Now when you multiply x_i^T with it, that is simply the same combination of the dot products as before, so the corresponding column of XY IS STILL a linearly dependent vector.

Correct me if I am wrong. Thanks!
 
  • #13
Pretty much; however, I would say all you need to assume is that there is a linearly dependent column in Y, so you may want to generalise a bit.
 
  • #14
Yeah, but the question assumes every column in Y is linearly dependent, so I am set. Thanks again!
 
  • #15
andorrak said:
Yeah, but the question assumes every column in Y is linearly dependent, so I am set. Thanks again!

No it doesn't; that statement doesn't even make sense.

It says "the columns of Y are linearly dependent"; this means at least one column can be written as a linear combination of the others.
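For anyone who wants to sanity-check the result numerically, here is a minimal sketch in NumPy (the matrices and the dependent column below are made-up examples, not from the problem):

[code]
import numpy as np

rng = np.random.default_rng(0)

# Build Y with a dependent third column: y_3 = 2*y_1 + 3*y_2
y1 = np.array([1.0, 0.0, 2.0])
y2 = np.array([0.0, 1.0, 1.0])
Y = np.column_stack([y1, y2, 2 * y1 + 3 * y2])

X = rng.standard_normal((4, 3))  # any X for which XY is defined
XY = X @ Y

# The same linear relation carries over to the columns of XY
print(np.allclose(XY[:, 2], 2 * XY[:, 0] + 3 * XY[:, 1]))  # True

# Rank check: dependent columns keep the rank below the column count
print(np.linalg.matrix_rank(Y) < Y.shape[1])    # True
print(np.linalg.matrix_rank(XY) < XY.shape[1])  # True
[/code]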
 

FAQ: Proof of linear independence and dependence

What is the difference between linear independence and dependence?

A set of vectors in a vector space is linearly independent if no vector in the set can be written as a linear combination of the others. It is linearly dependent if at least one of the vectors can be expressed as a linear combination of the others; in the special case of two vectors, dependence means one is a scalar multiple of the other.
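A concrete illustration (a standard textbook-style pair of examples, not from the thread above):

[tex]
\left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \end{pmatrix} \right\} \text{ is independent, while } \left\{ \begin{pmatrix} 1 \\ 2 \end{pmatrix}, \begin{pmatrix} 2 \\ 4 \end{pmatrix} \right\} \text{ is dependent, since } \begin{pmatrix} 2 \\ 4 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ 2 \end{pmatrix}
[/tex]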

How do you determine if a set of vectors is linearly independent or dependent?

A set of vectors is considered linearly independent if the only solution to the equation c1v1 + c2v2 + ... + cnvn = 0 is the trivial solution, where all coefficients ci are equal to 0. If there exists a non-trivial solution where at least one coefficient is not equal to 0, then the set of vectors is linearly dependent.
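This test can be run numerically; here is a minimal sketch with NumPy (floating-point rank is tolerance-based, so treat this as a heuristic for numerical data):

[code]
import numpy as np

def columns_independent(vectors):
    # Stack the vectors as columns; they are independent exactly
    # when the resulting matrix has full column rank.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

print(columns_independent([np.array([1.0, 0.0]), np.array([0.0, 1.0])]))  # True
print(columns_independent([np.array([1.0, 2.0]), np.array([2.0, 4.0])]))  # False
[/code]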

Can a set of two vectors be linearly dependent in one vector space and linearly independent in another?

Yes, in a precise sense: linear independence depends on the field of scalars as well as on the vectors themselves. Over a fixed scalar field, a set of vectors has the same dependence status in any vector space containing it; but regarding the same set of vectors over a different scalar field can change the answer, because the coefficients allowed in a linear combination change.
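The classic worked instance of this scalar-field point:

[tex]
\text{In } \mathbb{C} \text{ over } \mathbb{C}: \; i = i \cdot 1, \text{ so } \{1, i\} \text{ is dependent;} \qquad \text{in } \mathbb{C} \text{ over } \mathbb{R}: \; a + b i = 0 \text{ with } a, b \in \mathbb{R} \text{ forces } a = b = 0, \text{ so } \{1, i\} \text{ is independent}
[/tex]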

What is the significance of linear independence and dependence in linear algebra?

Linear independence and dependence play a crucial role in linear algebra as they help us understand the relationships between vectors in a vector space. If a set of vectors is linearly independent, then the vectors are considered to be "essential" in defining the vector space. On the other hand, if a set of vectors is linearly dependent, then they can be seen as "redundant" as they can be expressed as a linear combination of other vectors in the vector space.

How do linear independence and dependence relate to the concept of a basis?

A basis is a set of linearly independent vectors that span a vector space. This means that any vector in the vector space can be written as a linear combination of the basis vectors. Therefore, linear independence and dependence are closely related to the concept of a basis. A set of linearly independent vectors can form a basis for a vector space, while a set of linearly dependent vectors cannot form a basis.
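For instance (a standard example, spelled out):

[tex]
\left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right\} \text{ is a basis of } \mathbb{R}^2: \text{ any } \begin{pmatrix} a \\ b \end{pmatrix} = \tfrac{a+b}{2} \begin{pmatrix} 1 \\ 1 \end{pmatrix} + \tfrac{a-b}{2} \begin{pmatrix} 1 \\ -1 \end{pmatrix}, \text{ and neither vector is a multiple of the other}
[/tex]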
