Are x1, x2, and x3 linearly dependent in R^n?

In summary, we have three vectors x1, x2, x3 that are linearly independent in R^n, and we are asked whether the same is true of y1 = x2 - x1, y2 = x3 - x2, and y3 = x3 - x1. Setting up the coefficient matrix and reducing it to row echelon form shows that y1, y2, y3 are linearly dependent. More directly, y3 can be written as the sum of y1 and y2, which gives the same conclusion. The longer row-reduction method of determining linear independence is also valid here.
  • #1
DMOC
y1 = x2 - x1

y2 = x3 - x2

y3 = x3 - x1

x1, x2, x3 are linearly independent in R^n

Is the same true for y1, y2, and y3?

---

Well, normally with linear independence problems I can set up a matrix and check whether its row echelon form has free variables, or I can compute the determinant of a square matrix. Here I'm only told that x1, x2, and x3 are linearly independent. So I wrote c1*y1 + c2*y2 + c3*y3 = 0 in terms of the x's; since the x's are independent, the coefficient of each of x1, x2, x3 must be zero. That gives this augmented matrix, whose rows are y1, y2, y3 written in terms of x1, x2, x3 (with the zero right-hand side as the last column):

-1 1 0 0
0 -1 1 0
-1 0 1 0

And after row reducing, I end up with the following matrix:

1 0 -1 0
0 1 -1 0
0 0 0 0

So I conclude these vectors are linearly dependent (not independent), because the third column has no pivot and the third coefficient is a free variable?
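For anyone who wants to double-check this reduction numerically, here is a minimal sketch using SymPy (just one way to do it; the matrix below is simply the coefficient rows from above):

from sympy import Matrix

# Rows are y1, y2, y3 written in terms of x1, x2, x3
A = Matrix([[-1, 1, 0],
            [0, -1, 1],
            [-1, 0, 1]])

rref_form, pivot_cols = A.rref()
print(rref_form)   # last row is all zeros
print(A.rank())    # 2 < 3, so y1, y2, y3 are linearly dependent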
 
  • #2
Or you might note y3 = y1 + y2.
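Spelled out with the definitions from post #1: y1 + y2 = (x2 - x1) + (x3 - x2) = x3 - x1 = y3, so y1 + y2 - y3 = 0 is a nontrivial linear combination equal to zero.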
 
  • #3
Yes, that should be correct ... my bad. But would the longer method be valid? i.e. mathematically legal?
 
  • #4
DMOC said:
Yes, that should be correct ... my bad. But would the longer method be valid? i.e. mathematically legal?

Yes.
 
  • #5
Thank you.
 

Related to Are x1, x2, and x3 linearly dependent in R^n?

What is linear independence?

Linear independence refers to the property of a set of vectors in a vector space where no vector in the set can be written as a linear combination of the other vectors.
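For example, in R^2 the vectors (1, 0) and (0, 1) are linearly independent, while (1, 2) and (2, 4) are linearly dependent, since (2, 4) = 2(1, 2).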

Why is proving linear independence important?

Proving linear independence is important because it allows us to determine if a set of vectors can be used as a basis for a vector space. It also helps us understand the structure and relationships between vectors in a vector space.

What is a linear independence proof?

A linear independence proof is a mathematical method used to show that a set of vectors is linearly independent. This involves setting a general linear combination of the vectors equal to the zero vector and using algebraic techniques to show that the only solution is the trivial one (where all coefficients are equal to zero).
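Applied to the vectors in this thread: suppose c1*y1 + c2*y2 + c3*y3 = 0. Substituting the definitions and collecting the x terms gives

(-c1 - c3)*x1 + (c1 - c2)*x2 + (c2 + c3)*x3 = 0,

and because x1, x2, x3 are linearly independent, each coefficient must vanish: -c1 - c3 = 0, c1 - c2 = 0, c2 + c3 = 0. This system has nontrivial solutions (for example c1 = c2 = 1, c3 = -1), so y1, y2, y3 are linearly dependent.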

What are the steps to proving linear independence?

The steps to proving linear independence are as follows:

  • Set a linear combination of the given vectors, with unknown coefficients, equal to the zero vector.
  • Translate this equation into a system of equations in the coefficients.
  • Solve the system using algebraic techniques such as Gaussian elimination.
  • If the only solution is the trivial one, the set of vectors is linearly independent. If there are other solutions, the set is linearly dependent.

What are some common techniques used in linear independence proofs?

Some common techniques used in linear independence proofs include setting up and solving systems of equations, using Gaussian elimination, and utilizing properties of determinants and matrices.
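As a rough illustration of these techniques, here is a short sketch (assuming NumPy and SymPy are available; it reuses the coefficient rows from this thread):

import numpy as np
from sympy import Matrix

rows = [(-1, 1, 0), (0, -1, 1), (-1, 0, 1)]   # y1, y2, y3 in terms of x1, x2, x3
A = Matrix(rows)

# Gaussian elimination: independent iff every row has a pivot (rank = number of vectors)
independent_by_rank = A.rank() == len(rows)

# Determinant (square matrices only): nonzero determinant means independent
independent_by_det = not np.isclose(np.linalg.det(np.array(rows, dtype=float)), 0.0)

print(independent_by_rank, independent_by_det)   # False False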
