Linear Independence of \overline{w} and \overline{v} in R4/U

In summary, the thread discusses quotient spaces and cosets in linear algebra. The notation R4/U denotes the quotient space in which all vectors of the subspace U are treated as equivalent to zero; such spaces arise naturally when analyzing linear maps. The overline notation denotes a coset v + U, the set obtained by translating the entire subspace U by the vector v.
  • #1
smerhej

Homework Statement


Well it isn't so much the problem as it is the notation used within the problem. But here is the question:

Determine whether or not [itex]\overline{w}[/itex] and [itex]\overline{v}[/itex] are linearly independent in R4/U


Homework Equations


If v [itex]\in[/itex] V then [itex]\overline{v}[/itex] = v + U


The Attempt at a Solution



I don't understand the notation R4/U (I understand R4 and the subspace U, but don't understand the slash between them)
 
  • #2
it's called a "quotient space" and its elements are called "cosets" and consist of "parallel translates of a subspace by a vector".

that's why v has the overline: it's the SET:

{v+u: u in U}.

which can be thought of as the entire subspace U, moved in the direction/distance of v.

another way to think of it: the whole space (in this case R4) loses dim(U) dimensions, because all points in U are regarded as "equivalent (essentially 0)".

if dim(U) = 1, each coset is a parallel line, and you need a 3-vector to tell you "which line".

if dim(U) = 2, each coset is a parallel plane, and you need a 2-vector to tell you "which plane".

higher dimensions are harder to visualize, but the same sort of logic applies.

since v+U is a set, v is just a "representative", and the same coset v+U can have different representatives.

one common way quotient spaces arise is in analyzing linear maps: often, we don't care about the kernel of a linear map (because everything in it just maps to the 0-vector), so we "mod it out". the resulting quotient space is isomorphic to the image space (this is pretty much equivalent to the rank-nullity theorem, but in a more abstract setting).

you calculate with elements in R4/U pretty much like you do with elements in R4, but with a "+U" along for the ride:

v+U + w+U = (v+w)+U
a(v+U) = av+U

the overline notation is a bit "cleaner" but hides some of what is going on.
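The rules above can be sketched in code. A minimal numpy sketch, with a hypothetical choice of U and representatives (the names `same_coset`, `U`, `v`, `w` are illustrative, not from the thread); the key point is that two representatives name the same coset exactly when their difference lies in U:

```python
import numpy as np

# Hypothetical data (not from the thread): U = span{(1, 1, 0, 0)} in R^4,
# with basis vectors stored as the columns of the matrix U.
U = np.array([[1.0, 1.0, 0.0, 0.0]]).T

def same_coset(x, y, U):
    """x + U and y + U are the same coset exactly when x - y lies in U,
    i.e. appending x - y to a basis of U does not raise the rank."""
    return np.linalg.matrix_rank(np.column_stack([U, x - y])) == \
           np.linalg.matrix_rank(U)

v = np.array([2.0, 0.0, 1.0, 0.0])
w = v + U[:, 0]  # a different representative of the same coset

print(same_coset(v, w, U))                                   # True
print(same_coset(v, v + np.array([0.0, 0.0, 1.0, 0.0]), U))  # False

# The arithmetic rules act on representatives:
#   (v+U) + (w+U) = (v+w)+U   and   a(v+U) = av+U
```

Because the arithmetic only uses representatives, any choice of representative gives the same answer, which is exactly why the operations are well-defined.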
 

FAQ: Linear Independence of \overline{w} and \overline{v} in R4/U

What does it mean for vectors \overline{w} and \overline{v} to be linearly independent in R4/U?

Linear independence in R4/U means that the only scalars a and b with a\overline{v} + b\overline{w} = \overline{0} are a = b = 0. Since \overline{0} = U, this says that av + bw lies in U only for the trivial combination; in particular, neither coset is a scalar multiple of the other.

How can I determine if \overline{w} and \overline{v} are linearly independent in R4/U?

To determine whether \overline{w} and \overline{v} are linearly independent in R4/U, work with representatives but account for U: form a matrix whose columns are a basis of U together with v and w, and row reduce. The cosets are linearly independent exactly when the rank equals dim(U) + 2, that is, when no nontrivial combination av + bw falls inside U. Row reducing the matrix [v w] alone only checks independence in R4, which is necessary but not sufficient.
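In R4/U the test must account for U itself: the cosets fail to be independent whenever some nontrivial av + bw lands inside U. A minimal numpy sketch of a rank-based check (the subspace U, the vectors v and w, and the helper `cosets_independent` are illustrative assumptions, not from the thread):

```python
import numpy as np

# Hypothetical data (not from the thread): U = span{(1, 0, 0, 0)} in R^4.
U = np.array([[1.0, 0.0, 0.0, 0.0]]).T
v = np.array([0.0, 1.0, 0.0, 0.0])
w = np.array([0.0, 0.0, 1.0, 0.0])

def cosets_independent(U, *reps):
    """The cosets rep_i + U are linearly independent in R^n/U exactly when
    appending the representatives to a basis of U raises the rank by
    len(reps): no nontrivial combination of the reps falls inside U."""
    M = np.column_stack([U] + list(reps))
    return np.linalg.matrix_rank(M) == np.linalg.matrix_rank(U) + len(reps)

print(cosets_independent(U, v, w))            # True
print(cosets_independent(U, v, v + U[:, 0]))  # False: same coset twice
```

The second call returns False because v and v + (1, 0, 0, 0) differ by an element of U, so they represent the same coset.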

Why is linear independence important in R4/U?

Linear independence is important in R4/U because it is closely related to the concept of a basis. A basis is a set of linearly independent vectors that spans the entire vector space, so any vector in the space can be written as a linear combination of the basis vectors. Linear independence identifies a minimal set of vectors that represents all the others. Note that R4/U itself has dimension 4 − dim(U), so a basis of R4/U consists of that many cosets.

Can more than two vectors be linearly independent in R4/U?

Yes, provided dim(U) ≤ 1. The quotient space R4/U has dimension 4 − dim(U), so at most 4 − dim(U) cosets can be linearly independent, and a set of exactly that many linearly independent cosets is a basis for R4/U. If dim(U) = 2, for example, no three cosets can be linearly independent.

What is the relationship between linear independence and the determinant of a matrix?

For a square matrix, a nonzero determinant means its columns are linearly independent; equivalently, no column is a linear combination of the others. For cosets in R4/U this gives a quick check when the counts line up: if dim(U) + 2 = 4, form the 4×4 matrix whose columns are a basis of U together with v and w; its determinant is nonzero exactly when \overline{v} and \overline{w} are linearly independent.
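A minimal numpy illustration of the determinant check for four vectors in R4 (the matrix `A` is made-up example data, not from the thread):

```python
import numpy as np

# Made-up example: four vectors in R^4 stored as the columns of A.
A = np.array([
    [1.0, 0.0, 0.0, 1.0],
    [0.0, 1.0, 0.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 0.0, 1.0],
])

# Nonzero determinant <=> the columns are linearly independent.
print(abs(np.linalg.det(A)) > 1e-12)  # True

B = A.copy()
B[:, 3] = A[:, 0] + A[:, 1]           # force a dependent column
print(abs(np.linalg.det(B)) > 1e-12)  # False
```

In floating point, comparing against a small tolerance rather than exactly zero is the safer test.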
