Proving the Converse: Is H a Hyperplane?

In summary: the problem is to show that a hyperplane H in a vector space K is closed under affine combinations [tex]\alpha x + (1-\alpha)y[/tex] of its elements, and conversely that any subset H of K closed under such combinations is a hyperplane. The forward direction is proved in the first post, and a proof of the converse is posted in the second for feedback.
  • #1
psholtz

Homework Statement


I'm working on a problem involving hyperplanes and factor spaces. It involves a bit of setup. I'll describe first the definitions. Suppose you have a vector space K, of dimension n. Suppose you have a linear subspace L of K. Choose a vector [tex]x_0 \in K[/tex], then the hyperplane H can be defined as the set of all x such that:

[tex]x = x_0 + y[/tex]

where y ranges over the linear space L.

In simple terms, suppose we are working in R2. Then the line y = x + 1 would be a hyperplane of dimension 1 (with "offset" vector x0 = (0, 1)), while the line y = x would be a linear subspace of R2.
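To make the R2 example concrete, here is a quick numeric sketch (the names `x0`, `d`, and `on_line` are my own, not from the problem): every point x = x0 + t·d, with x0 = (0, 1) a point on the line y = x + 1 and d = (1, 1) spanning the subspace y = x, stays on the line y = x + 1.

```python
def on_line(p, tol=1e-9):
    """Check whether a point p = (px, py) lies on the line y = x + 1."""
    return abs(p[1] - (p[0] + 1)) < tol

x0 = (0.0, 1.0)   # "offset" vector: one particular point on y = x + 1
d = (1.0, 1.0)    # direction spanning L, the subspace y = x

# Every x = x0 + t*d should lie on the hyperplane y = x + 1.
for t in [-2.0, 0.0, 0.5, 3.0]:
    p = (x0[0] + t * d[0], x0[1] + t * d[1])
    assert on_line(p)
```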

This correlates roughly with the idea of homogeneous and inhomogeneous solutions to a linear system (or linear differential equation). The set of all solutions x of the homogeneous system Ax = 0 is a linear space, while a "particular" solution is derived from Ax = b. The "total" solution of the inhomogeneous system is then the "particular" solution plus the general solution of the homogeneous system.
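This analogy can be checked numerically; here is a minimal sketch (the matrix `A`, particular solution `xp`, and null-space vector `n` below are my own illustrative choices). The solution set of Ax = b is the "hyperplane" xp + null(A):

```python
import numpy as np

# A has a nontrivial null space, so Ax = b has a whole family of solutions.
A = np.array([[1.0, 1.0]])     # the single equation x + y = 3
b = np.array([3.0])

xp = np.array([3.0, 0.0])      # one particular solution of Ax = b
n = np.array([1.0, -1.0])      # spans the null space of A (solutions of Ax = 0)

# Total solution = particular solution + any homogeneous solution.
for t in [-1.0, 0.0, 2.5]:
    x = xp + t * n
    assert np.allclose(A @ x, b)
```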

The problem statement is: given [tex]x,y \in H[/tex], where H is a hyperplane in a vector space K of dimension n, prove that [tex]\alpha x + (1-\alpha)y \in H[/tex] for all [tex]\alpha[/tex]. Conversely, supposing this condition holds for some subset H of K, prove that H must be a hyperplane.

Homework Equations


See above.

The Attempt at a Solution


The "forward" direction of the proof is simple enough, I'll give it below. The converse direction is what I find a bit trickier, and I'll post it in the next post. What I'm looking for is feedback as to whether my "converse" proof is correct or not.

Going forward, take two elements [tex]x,y \in H[/tex], that is to say:

[tex]x = x_0 + z_1[/tex]

[tex]y = x_0 + z_2[/tex]

where

[tex]z_1,z_2 \in L[/tex]

(i.e., z1 and z2 are elements of the linear space L).

Then:

[tex]\alpha x + (1-\alpha) y = \alpha x_0 + \alpha z_1 + x_0 + z_2 - \alpha x_0 - \alpha z_2[/tex]

[tex]= x_0 + z_2 + \alpha z_1 - \alpha z_2[/tex]

Since L is a subspace (and hence closed under addition and scalar multiplication),

[tex]z_3 = z_2 + \alpha (z_1 - z_2) \in L[/tex]

we can write this expression as:

[tex]\alpha x + (1-\alpha)y = x_0 + z_3[/tex]

and so yes, the linear combination is in the hyperplane.
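As a sanity check on the forward direction, here is a small numeric sketch in R^3 (the subspace `L`, offset `x0`, and membership test `in_H` are illustrative choices, not part of the problem): affine combinations of two points of H stay in H for several values of alpha.

```python
import numpy as np

x0 = np.array([1.0, 2.0, 3.0])                      # offset vector
L = np.array([[1.0, 0.0, 1.0],                      # rows span the subspace L
              [0.0, 1.0, -1.0]])

def in_H(v, tol=1e-9):
    """v is in H iff v - x0 lies in the span of the rows of L (via least squares)."""
    c, *_ = np.linalg.lstsq(L.T, v - x0, rcond=None)
    return np.allclose(L.T @ c, v - x0, atol=tol)

x = x0 + 2.0 * L[0] - 1.0 * L[1]   # x = x0 + z1
y = x0 + 0.5 * L[0] + 3.0 * L[1]   # y = x0 + z2

# alpha*x + (1-alpha)*y should remain in H for every alpha.
for alpha in [-1.0, 0.0, 0.3, 1.0, 2.0]:
    assert in_H(alpha * x + (1 - alpha) * y)
```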

Going in the converse direction is what I'll post below, and what I'd like feedback on.
 
  • #2
OK, so suppose we have a set H such that given [tex]x,y \in H[/tex], we know that [tex]\alpha x + (1-\alpha)y \in H[/tex] for all [tex]\alpha[/tex].

The object is to prove that H is a hyperplane.

Note that this equation can be rewritten as:

[tex]y + \alpha(x-y) \in H[/tex]

So if we can prove that the set of differences of elements [tex]x,y \in H[/tex] forms a linear space, we will know that H is a hyperplane.

Let x, y range over the entire set H, and consider their difference. Designate this set (i.e., the set of their differences) as M. That is:

[tex]x,y \in H \implies x-y \in M[/tex]

Clearly, [tex]0 \in M[/tex], since we can choose y=x, and x-x=0 for all x in K.

Likewise, consider [tex]x,y,z \in H[/tex]. Clearly we have:

[tex]x-y \in M[/tex]

[tex]y-z \in M[/tex]

and

[tex]x-z \in M[/tex]

all by definition. But also:

[tex]x-z = (x-y) + (y-z)[/tex]

and so the sum [tex](x-y) + (y-z)[/tex] of these two elements of M is likewise an element of M.

Finally, choose [tex]x,y,z \in H[/tex]. From the condition we are given on H, we know that:

[tex]v_1 = y + \alpha (x-y) \in H[/tex]

[tex]v_2 = z + \alpha (x-z) \in H[/tex]

Now consider the difference [tex]v_1-v_2[/tex]. Clearly, since [tex]v_1, v_2 \in H[/tex], we must have:

[tex]v_1-v_2 \in M[/tex]

But likewise:

[tex]v_1 - v_2 = y-z + \alpha (z-y) = (1-\alpha)(y-z) \in M[/tex]

and this must hold for all [tex]\alpha[/tex]. Setting [tex]\alpha' = 1-\alpha[/tex], we may now write:

[tex]y,z \in H \implies \alpha'(y-z) \in M[/tex]

which must hold for all [tex]\alpha'[/tex], and we conclude that M must be a linear space.

Since M is a linear space, we conclude that the subset H is actually a hyperplane, as desired.
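The key identity in the converse argument, [tex]v_1 - v_2 = (1-\alpha)(y-z)[/tex], can be spot-checked numerically (the random vectors below merely stand in for arbitrary elements of H):

```python
import numpy as np

rng = np.random.default_rng(1)
x, y, z = rng.normal(size=(3, 4))   # arbitrary vectors standing in for elements of H

for alpha in [-0.7, 0.0, 0.4, 1.0, 2.3]:
    v1 = y + alpha * (x - y)        # affine combination of x and y
    v2 = z + alpha * (x - z)        # affine combination of x and z
    # The identity used in the proof: v1 - v2 = (1 - alpha) * (y - z)
    assert np.allclose(v1 - v2, (1 - alpha) * (y - z))
```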

My reason for posting is to see whether you agree. Is this proof correct?
 

FAQ: Proving the Converse: Is H a Hyperplane?

What is a hyperplane?

A hyperplane in an n-dimensional space is a flat subset of dimension n-1, i.e. one dimension lower than the ambient space. More generally, as in this problem, a hyperplane is a translate x0 + L of a linear subspace L.

What is the purpose of a hyperplane in data analysis?

Hyperplanes are commonly used in data analysis to divide a dataset into distinct classes or clusters. This can help in classification or clustering tasks, where the goal is to group similar data points together. Hyperplanes are also used in support vector machines, a popular machine learning algorithm.

What is a factor space?

A factor space, also known as a quotient space, is a mathematical concept that involves dividing a space into distinct equivalence classes based on a certain relation. In other words, elements in the factor space are considered equivalent if they satisfy a specific criterion.

What is the relationship between hyperplanes and factor spaces?

Hyperplanes and factor spaces are closely related: each hyperplane x0 + L is an element (a coset) of the factor space K/L. In other words, the factor space collects all the hyperplanes with a given direction subspace L into a single space.

What are some real-world applications of hyperplanes and factor spaces?

Hyperplanes and factor spaces have numerous practical applications in various fields such as image and signal processing, data analysis, and machine learning. For example, in computer vision, hyperplanes can be used to separate different objects in an image. In finance, factor spaces are used in portfolio optimization to group similar stocks together.
