Linear Algebra Question on Isomorphism

In summary, the homework problem asks for a proof that (U+W)/W is isomorphic to U/(U ∩ W). The discussion works out what the elements of the two quotient spaces look like and then how to build a well-defined linear bijection between them; showing that the map is well defined turns out to be the hard part.
  • #1
jmb576

Homework Statement


Let V be a finite-dimensional vector space and suppose that U and W are nonzero subspaces of V. Prove that (U+W)/W is isomorphic to U/(U [itex]\cap[/itex] W).


Homework Equations


Here / denotes a quotient space.


The Attempt at a Solution


Not even sure where to begin.
 
  • #2
Why don't you try starting by figuring out what the elements of the two quotient spaces look like?
 
  • #3
The first space is given by [tex] (U+W)/W = \{ (u +w) + W\; \vert \; u \in U, \quad w \in W \}[/tex]
and the second by [tex] U/(U \cap W) = \{ u + U \cap W\; \vert \; u \in U \}. [/tex]
I am still not sure how to proceed.
 
  • #4
Suppose you have [itex]x, y \in U+W[/itex] so that [itex]x = u_1 + w_1[/itex] and [itex]y = u_2 + w_2[/itex] where [itex]u_1, u_2 \in U[/itex] and [itex]w_1, w_2 \in W[/itex]. How do you know if the elements [x] and [y] of the quotient space (U+W)/W are the same?
 
  • #5
If [itex] x - y \in W [/itex] then aren't the two elements in [itex] (U+W)/W [/itex] equal?
 
  • #6
Yes, so if [itex](u_1-u_2) + (w_1-w_2) \in W[/itex], the two elements are equal. You should be able to convince yourself that's true if [itex]u_1 - u_2 \in W[/itex].
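Spelled out, the chain of equivalences is
[tex](u_1+w_1) + W = (u_2+w_2) + W \iff (u_1 - u_2) + (w_1 - w_2) \in W \iff u_1 - u_2 \in W,[/tex]
where the first step is just the definition of coset equality and the second uses the fact that [itex]w_1 - w_2 \in W[/itex] and W is closed under addition.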

By the way, I don't know exactly how the proof will work out, but I think seeing what the equivalence classes from each of the quotient spaces look like will probably help you get started.
 
  • #7
Yes, okay, so if the same constraint arises for the other quotient space, are they isomorphic?
 
  • #8
Maybe. You need to show there's a bijection between the equivalence classes of the two spaces. If the same condition arises, it might suggest what that bijection could be.
 
  • #9
I'm still having trouble figuring out how to progress from here.
 
  • #10
some thoughts:

is u+w+W the same coset as u+W in (U+W)/W? (can you prove this?)

suppose it was. how could you map an element

u+W in (U+W)/W to an element u+(U∩W) in U/(U∩W)?

things to check:

is my mapping well-defined?

(that means be SURE that if u' is in the same coset as u, in (U+W)/W,

we map u' to the same coset as we map u to, in U/(U∩W)).

this is the hard part. once you have your mapping

(let's call it f), pick a basis B for (U+W)/W, and show that

f(B) is a basis for U/(U∩W), and then f is an isomorphism if f is linear.
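
to make that concrete, here is a sketch of the candidate map and of the well-definedness check it calls for (linearity and bijectivity still have to be verified separately):

[tex]f\colon (U+W)/W \to U/(U\cap W), \qquad f\big((u+w) + W\big) = u + (U\cap W).[/tex]

if (u+w)+W = (u'+w')+W, then u - u' ∈ W (by the coset argument earlier in the thread); since u - u' is also in U, it lies in U∩W, so u + (U∩W) = u' + (U∩W), and f is well-defined.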
 
  • #11
Note that the component in W of a vector in U+W does not contribute to distinguishing elements.
So each vector-class in (U+W)/W is uniquely defined by its component in (U-W)[itex]\cup[/itex]{0}.

Does the same hold for U/(U[itex]\cap[/itex]W)?

If that were true, we would have a bijection.
Then the only thing left to prove would be that it is a linear map.
 
  • #12
I like Serena said:
Note that the component in W of a vector in U+W does not contribute to distinguishing elements.
So each vector-class in (U+W)/W is uniquely defined by its component in (U-W)[itex]\cup[/itex]{0}.

Does the same hold for U/(U[itex]\cap[/itex]W)?

If that were true, we would have a bijection.
Then the only thing left to prove would be that it is a linear map.

the linearity is going to be trivial (because we are dealing with subspaces). proving we have a map is the hardest part.
 
  • #13
Deveno said:
the linearity is going to be trivial (because we are dealing with subspaces). proving we have a map is the hardest part.

Yes, so we can define 2 maps.
f1 from (U+W)/W to (U-W)[itex]\cup[/itex]{0}.
f2 from U/(U[itex]\cap[/itex]W) to (U-W)[itex]\cup[/itex]{0}.

Would the map f2[itex]^{-1}[/itex] [itex]\circ[/itex] f1 be a suitable map?

Vectors in U-W are unique.
Vectors in W or in (U[itex]\cap[/itex]W) map to the zero-vector.
 
  • #14
I like Serena said:
Yes, so we can define 2 maps.
f1 from (U+W)/W to (U-W)[itex]\cup[/itex]{0}.
f2 from U/(U[itex]\cap[/itex]W) to (U-W)[itex]\cup[/itex]{0}.

Would the map f2[itex]^{-1}[/itex] [itex]\circ[/itex] f1 be a suitable map?

Vectors in U-W are unique.
Vectors in W or in (U[itex]\cap[/itex]W) map to the zero-vector.

in a finite-dimensional vector space, sure (or even in an infinite-dimensional one with countable basis).

in the general case, the existence of f2[itex]^{-1}[/itex] appeals to the axiom of choice. while i personally believe (for precisely these kinds of situations) that the axiom of choice is valid, since this is but a mere linear algebra question, it would be preferable to avoid it.

the trouble is, that in U+W, we do not have a unique decomposition of an element x as u+w. modding out W is exactly what we need to "fix" this problem, since if:

u+w = u'+w', then u-u' = w'-w.

now we have an equivalence on U by "something" (and that something has to be a subspace of U, to be a linear congruence on U, and it also clearly has to annihilate the part of W that lies in U).

we're saying the same thing, but in different ways. I'm appealing to group theory (ok, free F-modules, but we're all friends here, right?), you're appealing to the underlying sets, with a linear structure on them.

where you say "is uniquely defined by its component in..." i say "...is well-defined". same song, just covered by different bands.
 
  • #15
It seems to me that the simplest thing to do is calculate the dimensions of these quotient spaces. Finite-dimensional vector spaces with the same dimension are isomorphic.
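
For reference, the dimension count goes like this, using the standard formulas [itex]\dim(U+W) = \dim U + \dim W - \dim(U \cap W)[/itex] and [itex]\dim(V/S) = \dim V - \dim S[/itex]:

[tex]\dim\big((U+W)/W\big) = \dim(U+W) - \dim W = \dim U - \dim(U\cap W) = \dim\big(U/(U\cap W)\big).[/tex]

Since the two quotient spaces are finite-dimensional of the same dimension, they are isomorphic.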
 
  • #16
well that is another thing one can do.

of course, coming up with a basis for (U+W)/W isn't an intuitively obvious thing to do.

it might help to observe that if {w1,...,wk} is a basis for W, we can extend this to a basis {w1,...,wk,u1,...,um} of U+W.

this gives a good idea of which elements one might use for a basis for (U+W)/W.

but then you have a dilemma. if you start with {u1,...,um}, you might not yet have a basis for U, so you add more elements until you do. the thing is, in order to get the dimensional agreement we want, we need to show that any additional basis elements we choose have to be in W.

of course, if we have a formula for dim(V/U) for any vector space V and subspace U, we can neatly side-step all this. while that does tie things up nice and pretty, it avoids producing the isomorphism, so we somehow "know" these two items are isomorphic, but have no idea "why".

of course, in actual practice, a working linear algebraist (are there such things?) would probably do that (use a dimensional argument). however, quotient spaces confuse people on first exposure, so it's good practice to "get their hands dirty" and see the guts of how they work.

dimensional arguments are "implicit" (they refer to derived facts). exhibiting an isomorphism is "explicit". sometimes one is easy, and one is hard. it's a good thing for people to see they have the same industrial-cleaning strength.
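
to make the basis-extension idea concrete (just a sketch): with {w1,...,wk,u1,...,um} a basis of U+W as above, a natural candidate basis for (U+W)/W is

[tex]\{\,u_1 + W,\ \dots,\ u_m + W\,\},[/tex]

since every coset x+W is a linear combination of the uj+W (the wi+W are all the zero coset), and a relation a1u1+...+amum ∈ W would contradict the linear independence of the full basis unless every aj is 0.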
 
  • #17
Deveno said:
the trouble is, that in U+W, we do not have a unique decomposition of an element x as u+w. modding out W is exactly what we need to "fix" this problem,

But we do.
The sets:
1. (U-W)[itex]\cup[/itex]{0},
2. U[itex]\cap[/itex]W,
3. (W-U)[itex]\cup[/itex]{0}
are linearly independent vector spaces.
So any vector in U+W has a unique decomposition into a vector of each of these sets.

There's no need to construct such a unique decomposition - it suffices that it exists.
 
  • #18
I like Serena said:
But we do.
The sets:
1. (U-W)[itex]\cup[/itex]{0},
2. U[itex]\cap[/itex]W,
3. (W-U)[itex]\cup[/itex]{0}
are linearly independent vector spaces.
So any vector in U+W has a unique decomposition into a vector of each of these sets.

There's no need to construct such a unique decomposition - it suffices that it exists.

(U-W)[itex]\cup[/itex]{0} is not usually a vector space.

for example: let U = span({(1,0,0),(0,0,1)}), W = span({(1,0,0),(0,1,0)}).

clearly u = (1,0,1) and v = (1,0,-1) are in U, but not in W.

however, u+v = (2,0,0) is in W, so we do not have closure; (U-W)[itex]\cup[/itex]{0} is not a vector space.

what you want to do, and this is akin to HallsofIvy's approach, is break down a basis of U+W, into one of these 3 disjoint sets:

a)U-W
b)W-U
c)U∩W

you can't use the sets themselves to partition U+W, because U+W is a lot bigger than U∪W (the union usually isn't even a vector space: the union of the subspaces {(x,0)} and {(0,y)} in R² is just the x- and y-axes, which is certainly not all of R²).
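
if you want to double-check this example numerically, here is a quick python/numpy sketch (it just confirms the dimensions and the failure of closure; nothing in the argument depends on it):

[code]
import numpy as np

# U = span{(1,0,0), (0,0,1)},  W = span{(1,0,0), (0,1,0)} -- the example above
U = np.array([[1, 0, 0], [0, 0, 1]], dtype=float)
W = np.array([[1, 0, 0], [0, 1, 0]], dtype=float)

# u and v are in U but not in W, yet u + v lands in W,
# so (U-W) together with {0} is not closed under addition
u = np.array([1.0, 0.0, 1.0])
v = np.array([1.0, 0.0, -1.0])
print(u + v)  # [2. 0. 0.] -- a multiple of (1,0,0), which lies in W

# dimension bookkeeping for this example
dim_U = np.linalg.matrix_rank(U)                     # 2
dim_W = np.linalg.matrix_rank(W)                     # 2
dim_sum = np.linalg.matrix_rank(np.vstack([U, W]))   # dim(U+W) = 3
dim_int = dim_U + dim_W - dim_sum                    # dim(U ∩ W) = 1

# dim((U+W)/W) = 3 - 2 = 1 and dim(U/(U ∩ W)) = 2 - 1 = 1, as the theorem predicts
print(dim_sum - dim_W, dim_U - dim_int)  # 1 1
[/code]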
 
  • #19
You are right. I was sloppy with my notation.

But how do you make it (U-W)?
Is there a convention somewhere that this defines what I intended?
 
  • #20
I like Serena said:
You are right. I was sloppy with my notation.

But how do you make it (U-W)?
Is there a convention somewhere that this defines what I intended?

when making bases, one never includes the 0-vector, as it automatically forces linear dependence. we know that U∪W generates U+W, and a generating set is all we need to construct a basis. thus...we pick U-vectors until we get a maximal LI set; if some of those are also in W, we put them in U∩W.

then we pick more vectors from W until we get a basis for U+W. again, if some of these are in U, we stick those with the U∩W basis vectors.

to get the unique representation you are thinking of, we "lump all the W-vectors together".

this sends u+w = (linear combination in only U) + (linear combination in W) + (stuff with U and W mixed) all to u+W in (U+W)/W. now u may not be a "pure" U-vector, but we don't care, we've killed all the "W" part by modding by W. that way, we don't have to worry about it.

ok, suppose you have a plane U in R³. and then you take some other plane W, and form the span U+W. now we're thinking of R³ as U+W.

if we use W to "slice" R³, now we have something like an infinite deck of cards, all lining up nicely with the plane W. each point of R³ lies on only one of these W-slices (cosets), which we can label with some point u in U that that particular slice passes through. note we lose some information doing this, we don't know "where" on the W-slice our point of R³ lies: we've lost two dimensions (in other words, we don't need two basis vectors for U to determine which W-slice we're on; we can just pick one of them, and scale it until it ends on the W-slice we're on).

but we can look at this situation another way: we can use the intersection of U and W, which is some line on U, and just determine which parallel copy of U∩W any point in U lies on. to recover the W-slice, we just use that same point u of U that lies on the translated line of U∩W to specify a W-slice. conversely, we can specify a parallel line to U∩W by specifying which W-slice we're on, and using the line that is the intersection of that W-slice with U. either way, one vector in U is all we need to recover both situations.

my point is, using U and W is too much information to identify a point in R³. we could imagine 2 intersecting translates of each plane, but it's pretty clear we'd get a lot of duplicate 4-coordinate labels. but when we shrink W to a point {0}, so that we can think only in terms of U, we also collapse 1 dimension of U (the line of the intersection). now we have a unique representation "only in U-vectors", but we have this rather large W-equivalence class attached to each U-vector.

i don't know if that's very clear or not, but it's geometrically why modding a span-set by a subspace winds up being the same as a quotient of one of the subspaces in the span.

in actual practice, we tend to look for subspaces that intersect only in the origin, like the x-axis and the y-axis. then we do have uniqueness of u+w, and modding out W gives us U.
 

Related to Linear Algebra Question on Isomorphism

1. What is the definition of isomorphism in linear algebra?

In linear algebra, an isomorphism is a linear transformation between two vector spaces that preserves the structure of the vector spaces: it respects addition and scalar multiplication, and it is both one-to-one and onto.

2. How do you prove that two vector spaces are isomorphic?

To prove that two vector spaces are isomorphic, you need to show that there exists a linear transformation between the two spaces that is one-to-one and onto. This can be done by constructing an explicit mapping or, for finite-dimensional spaces, by showing that the dimensions of the two spaces are equal.

3. What is the significance of isomorphism in linear algebra?

Isomorphism is significant in linear algebra because it allows us to study different vector spaces and their properties by relating them to each other through a linear transformation. It also helps us to simplify and solve complex problems by transforming them into simpler, isomorphic spaces.

4. Can two isomorphic vector spaces have different bases?

Yes, two isomorphic vector spaces can have different bases. Isomorphic spaces have the same dimension, but their bases need not correspond in any particular way: a vector in one space and its image in the other may have different coordinate representations with respect to the chosen bases.

5. How is isomorphism related to matrix multiplication?

In linear algebra, matrix multiplication is used to represent linear transformations between vector spaces. Isomorphism is related to matrix multiplication because, once bases are chosen for two finite-dimensional isomorphic spaces, the isomorphism is represented by an invertible matrix, and applying it to a vector amounts to matrix multiplication.
