Proving Linear Dependence and Span in n-dimensional Space

In summary, the conversation discusses how to prove various properties of linear spans and linear dependence of vectors in a vector space. There is a discussion of the definition of a linear span and how to show that the linear span of a subset is contained in the linear span of the entire set. There is also a question about the meaning of the notation $\{1, \ldots, k\} = \{i_1, \ldots, i_k\}$ and whether it refers to the order of vectors in a linear combination.
  • #1
mathmari
Hey! :eek:

Let $1\leq n,k\in \mathbb{N}$ and let $v_1, \ldots , v_k\in \mathbb{R}^n$. Show that:
  1. Let $w\in \text{Lin}(v_1, \ldots , v_k)$. Then it holds that $\text{Lin}(v_1, \ldots , v_k)=\text{Lin}(v_1, \ldots , v_k,w)$.
  2. Let $v_1, \ldots , v_k$ be linearly dependent. Then there is a $1\leq i\leq k$ and $\lambda_1, \ldots , \lambda_{i-1}, \lambda_{i+1}, \ldots , \lambda_k\in \mathbb{R}$ such that $v_i=\lambda_1v_1+\ldots +\lambda_{i-1}v_{i-1}+\lambda_{i+1}v_{i+1}+\ldots +\lambda_kv_k$.
  3. Let $i_1, \ldots , i_k\in \mathbb{N}$ be such that $\{1, \ldots , k\}=\{i_1, \ldots , i_k\}$. Then it holds that $\text{Lin}(v_1, \ldots , v_k)=\text{Lin}(v_{i_1}, \ldots , v_{i_k})$.
  4. Let $v_1, \ldots , v_k$ be linearly dependent. Then there is a $1\leq i\leq k$ such that $\text{Lin}(v_1, \ldots , v_k)=\text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$.

I have already shown the first two points. Could you please give me a hint for point $3$ ? (Wondering) As for point $4$ : Do we use point $2$ here? Suppose $v_i=\lambda_1v_1 +\ldots +\lambda_{i-1}v_{i-1}+\lambda_{i+1}v_{i+1}+\ldots +\lambda_kv_k$. Then it holds that $\text{Lin}(v_1, \ldots , v_k)\subseteq \text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$, or not?
Now it is left to show that $\text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)\subseteq \text{Lin}(v_1, \ldots , v_k)$, or not?

Or is there another way to do this proof?

(Wondering)
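
For reference, the first of these inclusions rests on a substitution step that can be written out explicitly (a sketch, using the representation of $v_i$ from point $2$): if $x=\mu_1v_1+\ldots +\mu_kv_k\in \text{Lin}(v_1, \ldots , v_k)$, then
$$x=\sum_{j\neq i}\mu_jv_j+\mu_i\left(\lambda_1v_1+\ldots +\lambda_{i-1}v_{i-1}+\lambda_{i+1}v_{i+1}+\ldots +\lambda_kv_k\right)=\sum_{j\neq i}(\mu_j+\mu_i\lambda_j)v_j,$$
which is a linear combination of $v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k$.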
 
  • #2
For (3), what does "$\{1, \ldots, k\}= \{i_1, \ldots, i_k\}$" mean? With standard set notation it just says that the two index sets are equal. If it meant $i_1=1, \ldots, i_k=k$, so that $v_1= v_{i_1}, \ldots, v_k= v_{i_k}$, then the problem is trivial. Or is the point that the order doesn't matter, i.e. that $(i_1, \ldots, i_k)$ is a permutation of $(1, \ldots, k)$? Then the problem is almost trivial: just use the fact that vector addition is commutative.
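
As a concrete illustration of the second reading (a minimal sketch; the reordering is just an example): take $k=2$ with $i_1=2$, $i_2=1$. Every linear combination can be reindexed,
$$\lambda_1v_1+\lambda_2v_2=\lambda_2v_2+\lambda_1v_1=\lambda_{i_1}v_{i_1}+\lambda_{i_2}v_{i_2},$$
so $\text{Lin}(v_1, v_2)=\text{Lin}(v_{i_1}, v_{i_2})$.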
 
  • #3
mathmari said:
Hey! :eek:

Let $1\leq n,k\in \mathbb{N}$ and let $v_1, \ldots , v_k\in \mathbb{R}^n$. Show that:
4. Let $v_1, \ldots , v_k$ be linearly dependent. Then there is a $1\leq i\leq k$ such that $\text{Lin}(v_1, \ldots , v_k)=\text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$.

As for point $4$ : Do we use point $2$ here? Suppose $v_i=\lambda_1v_1 +\ldots +\lambda_{i-1}v_{i-1}+\lambda_{i+1}v_{i+1}+\ldots +\lambda_kv_k$. Then it holds that $\text{Lin}(v_1, \ldots , v_k)\subseteq \text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$, or not?

Hey mathmari!

Normally we start from the definition.
From wiki:
The vectors in a subset $S=\{\vec v_1,\vec v_2,\dots,\vec v_k\}$ of a vector space $V$ are said to be "linearly dependent" if there exist scalars $a_1,a_2,\dots,a_k$, not all zero, such that
$$a_1\vec v_1+a_2\vec v_2+\cdots+a_k\vec v_k= \vec 0,$$
where $\vec 0$ denotes the zero vector.

Let $a_i$ be one of those scalars that is not zero.
Then:
$$a_1\vec v_1+a_2\vec v_2+\cdots+a_k\vec v_k= \vec 0
\implies \vec v_i = -\frac{1}{a_i}\left(a_1 \vec v_1+\cdots + a_{i-1}\vec v_{i-1}+ a_{i+1}\vec v_{i+1}+\cdots+a_k\vec v_k\right)
$$
So $\vec v_i \in \operatorname{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$, isn't it? (Wondering)
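
A quick numerical sanity check of this rearrangement (a sketch using numpy; the specific vectors and coefficients are made up for illustration):

```python
import numpy as np

# Three dependent vectors in R^3: v3 = 2*v1 + v2 by construction.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = 2 * v1 + v2

# Scalars a with a1*v1 + a2*v2 + a3*v3 = 0, not all zero:
a = np.array([2.0, 1.0, -1.0])
assert np.allclose(a[0] * v1 + a[1] * v2 + a[2] * v3, 0)

# Pick i = 3 (since a3 != 0) and solve for v3 as in the post:
v3_recovered = -(a[0] * v1 + a[1] * v2) / a[2]
assert np.allclose(v3_recovered, v3)
```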
mathmari said:
No it is left to show that $\text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)\subset \text{Lin}(v_1, \ldots , v_k)$, or not?

Yes, and that follows from the definition of a linear span, doesn't it?
What is the definition of a linear span? (Wondering)
 
  • #4
Klaas van Aarsen said:
Let $a_i$ be one of those scalars that is not zero.
Then:
$$a_1\vec v_1+a_2\vec v_2+\cdots+a_k\vec v_k= \vec 0
\implies \vec v_i = -\frac{1}{a_i}\left(a_1 \vec v_1+\cdots + a_{i-1}\vec v_{i-1}+ a_{i+1}\vec v_{i+1}+\cdots+a_k\vec v_k\right)
$$
So $\vec v_i \in \operatorname{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$, isn't it? (Wondering)

So this direction follows from point 2., doesn't it? (Wondering)
Klaas van Aarsen said:
Yes, and that follows from the definition of a linear span, doesn't it?
What is the definition of a linear span? (Wondering)

Let $x\in \text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$. Then $x$ is a linear combination of the elements $v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k$, i.e. \begin{equation*}x=\lambda_1v_1+ \ldots + \lambda_{i-1}v_{i-1}+\lambda_{i+1} v_{i+1}+ \ldots+ \lambda_kv_k\end{equation*} We can also write this element as \begin{equation*}x=\lambda_1v_1+ \ldots + \lambda_{i-1}v_{i-1}+0\cdot v_i+\lambda_{i+1} v_{i+1}+ \ldots+ \lambda_kv_k\end{equation*} and now it is a linear combination of the elements $v_1, \ldots , v_{i-1}, v_i,v_{i+1}, \ldots, v_k$, which means that $x\in \text{Lin}(v_1, \ldots , v_k)$.

So we get that $\text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)\subseteq \text{Lin}(v_1, \ldots , v_k)$. Is everything correct? (Wondering)
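
A small numerical check of the resulting equality in point $4$ (a sketch using numpy; the vectors are again made up, with $v_3$ chosen dependent on $v_1, v_2$):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = 2 * v1 + v2  # dependent on v1 and v2

# Lin(v1, v2, v3) and Lin(v1, v2) have the same dimension, so
# dropping the dependent vector does not shrink the span.
full = np.column_stack([v1, v2, v3])
reduced = np.column_stack([v1, v2])
assert np.linalg.matrix_rank(full) == np.linalg.matrix_rank(reduced)
```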


HallsofIvy said:
For (3), what does "$\{1, \ldots, k\}= \{i_1, \ldots, i_k\}$" mean? With standard set notation it just says that the two index sets are equal. If it meant $i_1=1, \ldots, i_k=k$, so that $v_1= v_{i_1}, \ldots, v_k= v_{i_k}$, then the problem is trivial. Or is the point that the order doesn't matter, i.e. that $(i_1, \ldots, i_k)$ is a permutation of $(1, \ldots, k)$? Then the problem is almost trivial: just use the fact that vector addition is commutative.

I am also a bit confused about the meaning. I think your second interpretation is meant, since the first would be too easy. (Thinking)

So do we have to show that we can change the order of the vectors in a linear combination? (Wondering)
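
A sketch of the general reindexing step (assuming, as suggested above, that $(i_1, \ldots, i_k)$ is a permutation of $(1, \ldots, k)$): since vector addition is commutative and associative, the same terms can be summed in any order,
$$\lambda_1v_1+\ldots +\lambda_kv_k=\lambda_{i_1}v_{i_1}+\ldots +\lambda_{i_k}v_{i_k},$$
so every linear combination of $v_1, \ldots , v_k$ is also a linear combination of $v_{i_1}, \ldots , v_{i_k}$, and vice versa, which gives both inclusions.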
 

FAQ: Proving Linear Dependence and Span in n-dimensional Space

What is the definition of linear dependence?

Linear dependence describes a relationship among two or more vectors in a vector space: the vectors are linearly dependent if some linear combination of them, with coefficients that are not all zero, equals the zero vector. Equivalently, at least one of the vectors can be expressed as a linear combination of the others.

How do you prove linear dependence?

To prove linear dependence, you must show that at least one vector in the set can be written as a linear combination of the other vectors. This can be done by setting up a homogeneous system of equations for the coefficients and showing that it has a nonzero solution.
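
For instance, a short numerical version of this test (a sketch using numpy; the sample vectors are hypothetical):

```python
import numpy as np

# Columns of A are the vectors to test for linear dependence.
A = np.column_stack([
    [1.0, 2.0, 3.0],
    [2.0, 4.0, 6.0],  # twice the first vector
    [0.0, 1.0, 1.0],
])

# A @ a = 0 has a nonzero solution a exactly when the rank of A
# is smaller than the number of vectors.
dependent = np.linalg.matrix_rank(A) < A.shape[1]
print(dependent)  # True
```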

What is the difference between linear dependence and linear independence?

Linear independence refers to a set of vectors in which no vector can be written as a linear combination of the others; the vectors do not depend on each other. Linear dependence, on the other hand, means that at least one vector can be expressed as a linear combination of the other vectors.

How do you prove span in n-dimensional space?

To prove that a set of vectors spans an n-dimensional space, you must show that every vector in the space can be written as a linear combination of the given vectors. This can be done by solving the corresponding system of equations for the coefficients of the linear combination.
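
Concretely, one way to carry out this check numerically (a sketch using numpy; the vectors are hypothetical):

```python
import numpy as np

# Candidate spanning set for R^3, one vector per column.
A = np.column_stack([
    [1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0],
    [1.0, 1.0, 1.0],
])

# The columns span R^3 exactly when A @ x = w is solvable for
# every w, i.e. when the column rank equals the dimension 3.
spans = np.linalg.matrix_rank(A) == 3
print(spans)  # True

# Coefficients expressing one particular w in terms of the columns:
w = np.array([2.0, 3.0, 4.0])
x = np.linalg.solve(A, w)
assert np.allclose(A @ x, w)
```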

Why is proving linear dependence and span important in mathematics?

Proving linear dependence and span is important because it allows us to understand the relationships between vectors in a vector space. It also helps us to determine if a set of vectors is a basis for the space, which is crucial in many mathematical applications such as solving systems of linear equations and finding eigenvalues and eigenvectors.
