Linear Independence of Vectors: Why Determinant ≠ 0?

In summary, a square matrix has an inverse if and only if the vectors in it are linearly independent.
  • #1
Petrus
Hello MHB,
I have one question. Given the vectors \(\displaystyle V=(3,a,1)\), \(\displaystyle U=(a,3,2)\) and \(\displaystyle W=(4,a,2)\), why are they linearly independent exactly when the determinant is not equal to zero? (I am not interested in solving the problem, I just want to know why it works.)

Regards,
\(\displaystyle |\pi\rangle\)
 
  • #2
Petrus said:
Hello MHB,
I have one question. Given the vectors \(\displaystyle V=(3,a,1)\), \(\displaystyle U=(a,3,2)\) and \(\displaystyle W=(4,a,2)\), why are they linearly independent exactly when the determinant is not equal to zero? (I am not interested in solving the problem, I just want to know why it works.)

Regards,
\(\displaystyle |\pi\rangle\)

If you put your vectors in a matrix, you get the linear map represented by that matrix.
If that map can "reach" all of $\mathbb R^3$, your vectors are linearly independent.
The determinant tells you whether this is possible: a determinant of zero means it is not.
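
As a concrete illustration, here is a minimal sketch (assuming sympy is available) that puts the three vectors from the question in a matrix with a symbolic $a$ and finds the values of $a$ for which the determinant vanishes:

```python
import sympy as sp

a = sp.symbols('a')

# Rows are the vectors V, U, W from the question.
M = sp.Matrix([
    [3, a, 1],
    [a, 3, 2],
    [4, a, 2],
])

d = sp.expand(M.det())  # determinant as a polynomial in a
print(d)                # -a**2 + 2*a + 6
print(sp.solve(d, a))   # [1 - sqrt(7), 1 + sqrt(7)]
```

For every other value of $a$ the determinant is nonzero and the three vectors are linearly independent.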
 
  • #3
I like Serena said:
If you put your vectors in a matrix, you get the linear map represented by that matrix.
If that map can "reach" all of $\mathbb R^3$, your vectors are linearly independent.
The determinant tells you whether this is possible: a determinant of zero means it is not.
Thanks! I'm starting to understand now! :)

Regards,
\(\displaystyle |\pi\rangle\)
 
  • #4
Are these statements true?
1. We can check whether a set of vectors is linearly independent by checking that the determinant is not equal to zero.
2. A matrix whose vectors are linearly independent has an inverse.
3. I found this theorem on the internet that my book doesn't mention:
"If a set of vectors \(\displaystyle v_1,v_2,v_3,\dots,v_p\) lies in \(\displaystyle \mathbb R^n\),
then it is linearly dependent if \(\displaystyle p>n\)."

Regards,
\(\displaystyle |\pi\rangle\)
 
  • #5
Petrus said:
Are these statements true?
1. We can check whether a set of vectors is linearly independent by checking that the determinant is not equal to zero.
2. A matrix whose vectors are linearly independent has an inverse.
3. I found this theorem on the internet that my book doesn't mention:
"If a set of vectors \(\displaystyle v_1,v_2,v_3,\dots,v_p\) lies in \(\displaystyle \mathbb R^n\),
then it is linearly dependent if \(\displaystyle p>n\)."

Regards,
\(\displaystyle |\pi\rangle\)

You can only calculate the determinant of a square matrix.
That means you can only use a determinant to check independence of n vectors, each of dimension n.

An inverse can only exist for a square matrix.
If the matrix is square and the vectors in it are linearly independent, then there exists an inverse.

If you have n linearly independent vectors, they span an n-dimensional space, like $\mathbb R^n$.
One more vector must already lie in that space, so it is a linear combination of the others.
Hence a set of n+1 vectors in $\mathbb R^n$ has to be dependent.
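
Both points can be checked numerically; here is a minimal sketch (assuming numpy, with arbitrary example vectors):

```python
import numpy as np

# Three linearly independent vectors in R^3 as the rows of a square matrix:
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [1.0, 1.0, 3.0]])

print(np.linalg.det(A))  # nonzero (6.0), so the inverse exists
print(np.linalg.inv(A))  # well defined

# Append any fourth vector: 4 vectors in R^3 can have rank at most 3,
# so the enlarged set must be linearly dependent.
B = np.vstack([A, np.array([5.0, -2.0, 7.0])])
print(np.linalg.matrix_rank(B))  # 3, which is less than 4
```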
 
  • #6
I like Serena said:
You can only calculate the determinant of a square matrix.
That means you can only use a determinant to check independence of n vectors, each of dimension n.

An inverse can only exist for a square matrix.
If the matrix is square and the vectors in it are linearly independent, then there exists an inverse.

If you have n linearly independent vectors, they span an n-dimensional space, like $\mathbb R^n$.
One more vector must already lie in that space, so it is a linear combination of the others.
Hence a set of n+1 vectors in $\mathbb R^n$ has to be dependent.
Thanks, I meant a square matrix :)

Regards,
\(\displaystyle |\pi\rangle\)
 

Related to Linear Independence of Vectors: Why Determinant ≠ 0?

1. What is the definition of linear independence of vectors?

Linear independence is a property of a set of vectors in a vector space: the set is linearly independent if none of its vectors can be written as a linear combination of the others.

2. Why is it important to determine if vectors are linearly independent?

Determining whether vectors are linearly independent tells us how the vectors relate to each other and what they span in a vector space. It also tells us when a system of linear equations has a unique solution and when a set of vectors forms a basis.

3. What is the formula for calculating the determinant of a matrix?

For a \(\displaystyle 2\times 2\) matrix the determinant is
\(\displaystyle \det(A) = a_{11}a_{22} - a_{12}a_{21}.\)
For larger matrices the determinant is computed by cofactor expansion along a row or column.
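
A rough sketch of the cofactor expansion in plain Python (for clarity rather than speed; in practice use a library routine such as numpy.linalg.det):

```python
def det(m):
    """Determinant by cofactor expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

print(det([[3, 1],
           [2, 4]]))    # 3*4 - 1*2 = 10
print(det([[3, 0, 1],
           [0, 3, 2],
           [4, 0, 2]])) # 6
```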

4. How do you know if vectors are linearly independent based on their determinant?

If the determinant of a matrix formed by the vectors is equal to 0, then the vectors are linearly dependent. This means that one of the vectors can be written as a linear combination of the other vectors. If the determinant is not equal to 0, then the vectors are linearly independent.

5. Can linearly independent vectors become linearly dependent?

A fixed set of vectors is either linearly independent or dependent; that does not change. What can happen is that adding a vector to an independent set makes the enlarged set dependent, for example when the new vector is a multiple of one of the existing vectors, or more generally a linear combination of them; see the sketch below.
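
A minimal sketch of this situation (assuming numpy, with made-up vectors):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
print(np.linalg.matrix_rank(np.vstack([v1, v2])))      # 2: the pair is independent

v3 = 2 * v1 - 5 * v2  # a linear combination of v1 and v2
print(np.linalg.matrix_rank(np.vstack([v1, v2, v3])))  # still 2: the triple is dependent
```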
