Linear algebra: subspaces, linear independence, dimension

  • #1
morsel

Homework Statement


1. Consider three linearly independent vectors v1, v2, v3 in Rn. Are the vectors v1, v1+v2, v1+v2+v3 linearly independent as well?

2. Consider a subspace V of Rn. Is the orthogonal complement of V a subspace of Rn as well?

3. Consider the line L spanned by
[1
2
3]. Find a basis of the orthogonal complement of L.

4. Consider a nonzero vector v in Rn. What is the dimension of the space of all vectors in Rn that are perpendicular to v?


Homework Equations





The Attempt at a Solution


1. I think since the three vectors are linearly independent, adding them together doesn't create redundancies. But this seems like an inadequate explanation..

2. The orthogonal complement of V is the kernel of V. Since the kernel is a subspace, the orthogonal complement is a subspace as well.

3. I was thinking of row reducing this matrix to get the orthogonal complement..
[1 0
2 0
3 0]

4. I don't even know where to start with this one..

Thanks in advance!
 
  • #2
morsel said:

1. I think since the three vectors are linearly independent, adding them together doesn't create redundancies. But this seems like an inadequate explanation..
Yes, I agree that your explanation is inadequate. A better approach is to use the definition: suppose c1v1 + c2(v1 + v2) + c3(v1 + v2 + v3) = 0, regroup the left side as (c1 + c2 + c3)v1 + (c2 + c3)v2 + c3v3, and use the linear independence of v1, v2, v3 to conclude that c1 = c2 = c3 = 0.
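As a numerical sanity check (not a proof), you can stack the three new vectors as columns and compute the rank with NumPy. The vectors below are a made-up example in R4:

```python
import numpy as np

# Assumed example: three linearly independent vectors in R^4
v1 = np.array([1.0, 0.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0, 0.0])
v3 = np.array([0.0, 0.0, 1.0, 0.0])

# Stack the transformed vectors as columns and check the rank:
# rank 3 means v1, v1+v2, v1+v2+v3 are linearly independent.
A = np.column_stack([v1, v1 + v2, v1 + v2 + v3])
print(np.linalg.matrix_rank(A))  # 3
```

Of course, the rank check only confirms the result for one particular choice of vectors; the regrouping argument above proves it for all choices.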
morsel said:
2. The orthogonal complement of V is the kernel of V. Since the kernel is a subspace, the orthogonal complement is a subspace as well.
No, the orthogonal complement of V is not the kernel of V; a kernel belongs to a matrix or a linear transformation, not to a subspace. The orthogonal complement of V is the set of all vectors in Rn that are perpendicular to every vector in V.
morsel said:
3. I was thinking of row reducing this matrix to get the orthogonal complement..
[1 0
2 0
3 0]
The vector you are given is the basis of a one-dimensional subspace of R3. That means that the orthogonal complement of L is a two-dimensional subspace of R3. If you take an arbitrary vector in the orthogonal complement of L and dot it with your given vector, what should you get?
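To check a candidate answer for #3 numerically: the single equation x + 2y + 3z = 0 has two free variables, and back-substituting gives a basis you can verify with NumPy (this is one possible basis, not the only one):

```python
import numpy as np

# The complement of L consists of all [x, y, z] with x + 2y + 3z = 0.
# Taking y and z as the free variables gives one candidate basis:
w1 = np.array([-2.0, 1.0, 0.0])   # y = 1, z = 0
w2 = np.array([-3.0, 0.0, 1.0])   # y = 0, z = 1
v = np.array([1.0, 2.0, 3.0])

print(np.dot(w1, v), np.dot(w2, v))                # 0.0 0.0
print(np.linalg.matrix_rank(np.vstack([w1, w2])))  # 2 -> independent
```

Two independent vectors, each perpendicular to v, spanning a 2-dimensional space: that is exactly a basis of the orthogonal complement of L.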
morsel said:
4. I don't even know where to start with this one..
I'm not sure you understand what orthogonal complement means. Look up the definition. This is related to what is being asked in #4.
 
  • #3
Thanks for your help. I understand how to approach the other problems but I'm still unsure about #4.

Is the dimension n-1 because there's a free variable? In other words, n-1 number of leading 1's in the rref form?
 
  • #4
morsel said:
Thanks for your help. I understand how to approach the other problems but I'm still unsure about #4.

Is the dimension n-1 because there's a free variable? In other words, n-1 number of leading 1's in the rref form?
Yes, the dimension of the subspace of vectors in Rn that are perpendicular to the given vector v is n - 1. No, it's not because there's a free variable - it's because there are n - 1 free variables.

What matrix in RREF form are you talking about? There is only one equation, and it has n variables. Do you know where this equation comes from?
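A quick numerical illustration of this point (using an arbitrary n and a random v, purely as an example): the perpendicular space is the null space of the 1 x n matrix whose single row is v, and rank-nullity gives its dimension as n minus the rank, which is 1 for any nonzero v.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 7
v = rng.standard_normal(n)        # a generic nonzero vector in R^n

# The perpendicular space is the null space of the 1 x n matrix v^T,
# coming from the single equation v . x = 0 in n unknowns.
rank = np.linalg.matrix_rank(v.reshape(1, n))
print(n - rank)                   # 6, i.e. n - 1
```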
 

FAQ: Linear algebra: subspaces, linear independence, dimension

What is a subspace in linear algebra?

A subspace in linear algebra is a subset of a vector space that satisfies three properties: it contains the zero vector, it is closed under addition, and it is closed under scalar multiplication. In other words, a subspace is a smaller vector space contained within a larger vector space.

How do you determine if a set of vectors is linearly independent?

A set of vectors is linearly independent if none of the vectors in the set can be written as a linear combination of the others. This can be checked by setting a linear combination of the vectors equal to the zero vector and solving for the coefficients. If the only solution is all coefficients equal to zero, then the set of vectors is linearly independent.
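This check can be carried out mechanically with NumPy (the vectors below are made up for illustration): the set is independent exactly when the matrix with the vectors as columns has rank equal to the number of vectors.

```python
import numpy as np

# Assumed example: test whether three vectors in R^3 are independent by
# checking whether c1*u1 + c2*u2 + c3*u3 = 0 forces c1 = c2 = c3 = 0.
u1 = np.array([1.0, 2.0, 3.0])
u2 = np.array([0.0, 1.0, 1.0])
u3 = np.array([1.0, 3.0, 4.0])    # u3 = u1 + u2, so the set is dependent

A = np.column_stack([u1, u2, u3])
independent = np.linalg.matrix_rank(A) == A.shape[1]
print(independent)                # False
```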

What is the dimension of a vector space?

The dimension of a vector space is the number of vectors in a basis for that space. A basis is a set of linearly independent vectors that span the entire vector space. All bases of a given space have the same size, so the dimension is well defined; note that a subspace of Rn can have dimension smaller than n, the number of components of its vectors.
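For instance (a hypothetical NumPy example): a plane through the origin in R3 is spanned by two independent vectors with three components each, yet its dimension is 2, not 3.

```python
import numpy as np

# A plane through the origin in R^3: its vectors have 3 components, but
# the subspace has dimension 2 (the size of any basis).
basis = np.array([[1.0, 0.0,  2.0],
                  [0.0, 1.0, -1.0]])
print(np.linalg.matrix_rank(basis))  # 2
```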

Can a vector space have more than one basis?

Yes, a vector space can have multiple bases. This is because there can be more than one set of linearly independent vectors that can span the entire space. However, all bases for a given vector space will have the same number of vectors, which is equal to the dimension of the space.

How is linear algebra used in real-world applications?

Linear algebra is used in a variety of fields, including engineering, physics, economics, and computer science. It is used to solve systems of linear equations, perform data analysis and modeling, and create computer graphics. It is also used in machine learning and artificial intelligence to optimize algorithms and make predictions based on large datasets.
