Need Explanation (Linear Independence)

In summary, the columns of A are linearly dependent. The last column of AB equals A times the last column of B; that is, it is a linear combination of the columns of A whose coefficients are the entries of B's last column. Because B has no column of zeros, those coefficients are not all zero, yet the combination equals the zero vector, so the columns of A cannot be linearly independent. To clarify a point raised in the thread, A and B are not contravariant vectors with AB as an inner (scalar) product; they are matrices, and each column of AB is a linear combination of the columns of A.
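Written out symbolically (the notation a_1, ..., a_k for the columns of A and b_n for the last column of B is mine, not from the thread), the argument is:

```latex
\[
(AB)_{\text{last column}} \;=\; A\,b_n \;=\; b_{1n}\,a_1 + b_{2n}\,a_2 + \cdots + b_{kn}\,a_k \;=\; 0.
\]
% b_n is not the zero vector (B has no column of zeros), so this is a
% nontrivial linear combination of the columns of A equal to zero,
% which is exactly the definition of linear dependence.
```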
  • #1
stuckie27
A and B are both matrices.

Suppose the last column of AB is entirely zero but B itself has no column of zeros. What can be said about the columns of A?

Answer: The columns of A are linearly dependent.

Question: Why?
 
  • #2
stuckie27 said:
Suppose the last column of AB is entirely zero but B itself has no column of zeros. What can be said about the columns of A?

Answer: The columns of A are linearly dependent.

Question: Why?

Are A and B contravariant vectors? Is AB their inner (scalar) product? If so, I'm not sure how it can have columns; please clarify.
 
  • #3
Edit: A and B are each a different matrix.
 
  • #4
Take a column in AB: what do the entries represent? Are they in some way related to linear combinations of the columns of A? (Yes, they are; that isn't rhetorical.) And a column of zeroes might mean that... fill in the blanks using the definition of linear (in)dependence.
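To make the hint concrete, here is a small numeric sketch in Python (the specific matrices are my own example, not from the thread); the last column of B supplies the coefficients of a dependence among the columns of A:

```python
# A minimal numeric sketch (the matrices are my own example, not from the thread).
# The columns of A are dependent: col3 = col1 + col2.  B has no column of zeros.
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0]])          # third column = first + second
B = np.array([[1.0, 0.0,  1.0],
              [0.0, 1.0,  1.0],
              [1.0, 1.0, -1.0]])         # every column is nonzero

AB = A @ B
print(AB)
# The last column of AB is A @ [1, 1, -1]^T = col1(A) + col2(A) - col3(A) = 0:
# the entries of B's last column are the coefficients of a nontrivial
# dependence among the columns of A.
```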
 

FAQ: Need Explanation (Linear Independence)

What is linear independence?

Linear independence is a concept in linear algebra: a set of vectors is independent if none of the vectors can be expressed as a linear combination of the others. In other words, no vector in the set can be built by scaling and adding the remaining vectors in the set.
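Stated symbolically (standard notation, not taken from the original answer), vectors v_1, ..., v_k are linearly independent when

```latex
\[
c_1 v_1 + c_2 v_2 + \cdots + c_k v_k = 0
\quad\Longrightarrow\quad
c_1 = c_2 = \cdots = c_k = 0,
\]
% i.e. the only linear combination of v_1, ..., v_k equal to the zero vector
% is the trivial one with all coefficients zero.
```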

Why is linear independence important?

Linear independence is important because it allows us to understand the relationship between vectors in a set. It helps us determine if a set of vectors is a basis for a vector space, which is crucial in many mathematical and scientific applications.

How do you determine if a set of vectors is linearly independent?

To determine if a set of vectors is linearly independent, we can apply the definition directly and check whether any vector in the set can be expressed as a linear combination of the others. Alternatively, we can form a matrix whose columns are the vectors and compute its rank: the vectors are independent exactly when the rank equals the number of vectors (for a square matrix, when the determinant is nonzero).
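A rough sketch of the rank test in code (assuming NumPy is available; the example vectors are made up for illustration):

```python
# Rank test for linear independence (assumes NumPy; the vectors are example data).
import numpy as np

vectors = [np.array([1.0, 0.0, 2.0]),
           np.array([0.0, 1.0, 1.0]),
           np.array([1.0, 1.0, 3.0])]    # third = first + second, so dependent

M = np.column_stack(vectors)             # stack the vectors as matrix columns
independent = np.linalg.matrix_rank(M) == len(vectors)
print(independent)                       # False: rank 2 < 3 vectors
```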

Can a set of linearly dependent vectors be converted to a set of linearly independent vectors?

Yes. A linearly dependent set can be reduced to a linearly independent one by removing redundant vectors, i.e. vectors that are linear combinations of the others. The result is a basis of the subspace spanned by the original set; in practice this is often done by row-reducing the matrix of vectors and keeping the pivot columns, as sketched below.
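One possible way to carry this out, sketched with SymPy (a tooling assumption; the example matrix is mine): row-reduce and keep only the pivot columns.

```python
# Keep an independent subset via the pivot columns of the row-reduced matrix
# (SymPy is assumed available; the example matrix is made up for illustration).
import sympy as sp

M = sp.Matrix([[1, 0, 1],
               [0, 1, 1],
               [2, 1, 3]])               # column 3 = column 1 + column 2

_, pivot_cols = M.rref()                 # indices of an independent subset
basis = [M.col(j) for j in pivot_cols]   # these columns span the column space
print(pivot_cols)                        # (0, 1)
print(basis)
```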

How is linear independence used in real-world applications?

Linear independence is used in various fields such as physics, engineering, economics, and computer science. It underlies solving systems of linear equations, analyzing data and patterns, and building mathematical models. In physics, it is used when resolving forces into independent components; in economics, it is used to analyze market trends and build predictive models.
