Proof that det(M)=0 => Linear Dependence of Columns

  • #1
bananabandana

Homework Statement


Prove that for a general N×N matrix M, det(M) = 0 implies linear dependence of the columns.

Homework Equations

The Attempt at a Solution


It's not clear to me at all how to approach this. We've just started linear algebra and this was stated without proof in lecture. Can someone give me a hint about a starting point?

Thanks! :)
 
  • #2
bananabandana said:

Prove that for a general N×N matrix M, det(M) = 0 implies linear dependence of the columns.
Depends on what you know about determinants. Do you know there are row operations you can do that don't change the determinant or change it only by a sign? Can you show you can reduce any matrix to upper triangular form with those row operations? Then the determinant depends only on the diagonal elements of the upper triangular matrix. Relate the linear independence to the value of those diagonal elements in the upper triangular form.
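One way to flesh this out (a sketch, assuming the standard facts that elimination using only row swaps and row additions reaches upper triangular form and at most flips the sign of the determinant):
$$ \det(M) = \pm \det(U) = \pm\, u_{11} u_{22} \cdots u_{nn}, $$
where ##U## is the upper triangular matrix obtained by elimination. If ##\det(M) = 0## then some ##u_{ii} = 0##, so the first ##i## columns of ##U## all lie in the span of ##e_1, \dots, e_{i-1}## and are therefore linearly dependent. Since the row operations are invertible, ##U\vec{x} = \vec{0} \iff M\vec{x} = \vec{0}##, so the same dependence relation holds among the columns of ##M##.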
 
  • #3
No, I didn't know that. I will look up the method and try to go from there. Thanks!
 
  • #4
It also depends on what definition of determinant you have to work from.
My old textbook (D.T. Finkbeiner II, 1966) defines it by three axioms:
1. Linearity with respect to columns: if a column vector ##v## of ##A## can be expressed as a linear combination of two vectors, ##v = a v_1 + b v_2##, and ##A_1##, ##A_2## are the matrices identical to ##A## except that ##v## is replaced by ##v_1##, ##v_2## respectively, then ##\det(A) = a \det(A_1) + b \det(A_2)##.
2. If two adjacent columns are equal then the determinant is 0.
3. ##\det(I) = 1##.
It's not hard to deduce that swapping two adjacent columns switches the sign of the determinant, as sketched below.
From there, you need to extend to switching non-adjacent columns, and so on.
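A sketch of that deduction (writing ##D(u, v)## for the determinant as a function of two adjacent columns, all other columns held fixed): by axiom 2 and then axiom 1,
$$ 0 = D(u+v,\, u+v) = D(u,u) + D(u,v) + D(v,u) + D(v,v) = D(u,v) + D(v,u), $$
so ##D(v,u) = -D(u,v)##.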
 
  • #5
Would this be a valid solution?

##|A|=0## implies there are non-trivial solutions to the equation ##\mathbf{A}\mathbf{x}=\vec{0}##: if ##|A|=0##, the equation has either no solutions or infinitely many, and since ##\mathbf{x}=\vec{0}## is always a solution, there must be infinitely many.

Matrix ##A## can be written as a list of column vectors:
$$ A = [\mathbf{a}_{1}, \mathbf{a}_{2}, \dots, \mathbf{a}_{n}], $$
where each ##\mathbf{a}_{i}## is a member of ##\mathbb{R}^{N}##.

This implies that:

$$ x_{1}\mathbf{a}_{1}+x_{2}\mathbf{a}_{2} + \dots + x_{n}\mathbf{a}_{n} = \vec{0} $$

for some set of ##x_{i}## which are not all zero. Therefore the column vectors of the matrix are linearly dependent if ##|A| = 0##.
Thanks!
 
  • #6
bananabandana said:
##|A|=0## implies there are non-trivial solutions to the equation ##\mathbf{A}\mathbf{x}=\vec{0}##
How do you know that? Does it come directly from the definition of determinant that you have been taught, or from some theorem that you are allowed to quote?
 
  • #7
Sorry, I was rushed and did not post the proof properly. Hopefully it should read as follows:

1. Proof that ##|A|=0 \iff## non-trivial solutions to:
$$ \mathbf{A}\vec{x}=\vec{0} \ (*) $$
i) ##|A|=0## implies (via Cranmer's rule/matrix inversion) that there are either no solutions or infinitely many solutions to (*).
Since ##\vec{x} = \vec{0}## is a solution, there must be an infinite number of solutions. ##\therefore |A| = 0 \implies## non-trivial solutions.

ii) Conversely, if there is a non-trivial solution to (*), then since ##\vec{x} = \vec{0}## is also a solution, the solution is not unique, and therefore ##|A| = 0##.

Hope that is better :)
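(For reference, since it is invoked above: Cramer's rule states that if ##|A| \neq 0##, the system ##A\vec{x} = \vec{b}## has the unique solution
$$ x_{i} = \frac{|A_{i}|}{|A|}, $$
where ##A_{i}## is ##A## with its ##i##-th column replaced by ##\vec{b}##. The contrapositive, that non-unique solutions force ##|A| = 0##, is what step ii) uses.)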
 
  • #8
bananabandana said:

i) ##|A|=0## implies (via Cranmer's rule/matrix inversion) that there are either no solutions or infinitely many solutions to (*).
That looks ok if you are allowed to quote Cramer's rule (not Cranmer's; I believe Thomas Cranmer's rule was to keep Henry happy). The danger here is that this may be regarded as a more advanced theorem than the one you are trying to prove. This is often a difficulty when asked to prove something which is generally taken as a well known fact. In my view you should attempt to rely only on facts which are evidently more 'primitive'.
What definition of determinant have you been given?
 
  • #10
bananabandana said:
The determinant has just been defined as an operation to retrieve a number from a square matrix, via the Laplace expansion. (https://en.wikipedia.org/wiki/Laplace_expansion)
Then I feel you should try to derive the result directly from that definition and not appeal to any standard theorems.
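For readers following along, the Laplace (cofactor) expansion along row ##i## reads
$$ \det(A) = \sum_{j=1}^{n} (-1)^{i+j} a_{ij} M_{ij}, $$
where the minor ##M_{ij}## is the determinant of ##A## with row ##i## and column ##j## deleted.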
 

FAQ: Proof that det(M)=0 => Linear Dependence of Columns

What is det(M)?

In linear algebra, det(M) refers to the determinant of a matrix M. It is a scalar calculated from the entries of a square matrix; among other things, it tells you whether the matrix is invertible and how the matrix scales volumes.
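For a concrete example, the simplest case is the 2×2 determinant:
$$ \det\begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc. $$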

What does it mean for det(M) to equal 0?

When det(M) equals 0, the matrix M is singular, i.e. non-invertible. A system of equations with coefficient matrix M then has either no solution or infinitely many, never exactly one, and the columns of M are linearly dependent.

How does det(M)=0 relate to linear dependence of columns?

The absolute value of the determinant is the volume (in 2D, the area) of the parallelepiped spanned by the column vectors. When det(M) equals 0, that volume is 0, which means the columns lie in a lower-dimensional subspace, i.e. they are linearly dependent.
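A quick check in the 2×2 case: the columns of ##\begin{pmatrix} a & b \\ c & d \end{pmatrix}## are ##(a, c)^{T}## and ##(b, d)^{T}##, and ##ad - bc = 0## holds exactly when one column is a scalar multiple of the other, i.e. when the parallelogram they span collapses to a line segment (or a point).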

Can a matrix have det(M)=0 without having linearly dependent columns?

No, a matrix cannot have a determinant of 0 without having linearly dependent columns. This is because a determinant of 0 means that the matrix is singular, and a singular matrix always has linearly dependent columns.
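For reference, these facts all sit inside the standard chain of equivalences for a square matrix ##M##:
$$ \det(M) \neq 0 \iff M \text{ is invertible} \iff M\vec{x} = \vec{0} \text{ has only the trivial solution} \iff \text{the columns of } M \text{ are linearly independent}. $$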

What is the significance of proving det(M)=0 to show linear dependence of columns?

Showing that det(M) = 0 is a concrete, computable way to establish that the columns of a matrix are linearly dependent. This matters for understanding the structure of the matrix and for deciding whether the systems of equations it represents have unique solutions.
