Math Challenge - June 2020

In summary: How do you get that ##i^{\frac{1}{2}}## is ##e^{\frac{-\pi}{2}}##? I'm sorry, I don't understand. First of all, you are wrong to take the exp in the last step, and this is why you do not understand my point. The complex logarithm is not single-valued the way the real logarithm is, and if you take the exp you have to take care of this. The complex logarithm is defined as follows: given ##z=re^{it}## with ##r>0##, we define ##\ln z = \log r + it##.
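To make the pitfall concrete, here is a minimal sketch using Python's standard `cmath` (my own addition, not part of the thread): since ##e^z## is ##2\pi i##-periodic, applying `log` after `exp` returns the principal branch, which need not be the value you started from.

```python
import cmath

# exp is 2*pi*i-periodic, so log(exp(w)) recovers w only modulo 2*pi*i.
w = 2.5j * cmath.pi          # exp(w) = i, since 2.5*pi = pi/2 + 2*pi
z = cmath.exp(w)
w_back = cmath.log(z)        # principal branch: imaginary part in (-pi, pi]

print(z)                     # approximately 1j
print(w_back)                # approximately 0.5j*pi -- not the original w
```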
  • #71
Infrared said:
Why not ##a=b=0##?
Oh yeah! I didn’t take that into account, thanks.
 
  • #72
In my opinion the solution of problem 3 by zinq is not completely formal, but it is completely correct and nice. I would consider problem 3 solved by zinq.
I also think that the Baire category theorem is needed to turn zinq's argument into a formal proof.
 
  • #73
fresh_42 said:
2. Let ##A## and ##B## be complex ##n\times n## matrices such that ##AB-BA## is a linear combination of ##A## and ##B##. Show that ##A## and ##B## must have a common eigenvector. (IR)

Thank you to @Infrared for posing this question. It made me discover some holes in my understanding of even the simpler case where ##AB=BA##. (E.g., certain conditions on the intersection of the nullspaces of ##A## and ##B## must be met.)

Anyway, I'll start with a non-general partial solution to get things moving...

For this initial simple case, assume we are given $$ AB - BA = \alpha A + \beta B ~,~~~~~ (1)$$ where the scalar coefficients ##\alpha,\beta## are non-zero.

Let ##|a\rangle## be an eigenvector of ##A## with eigenvalue ##a##, i.e., ##A |a\rangle = a |a\rangle##. Then ##|a\rangle## is also an eigenvector of ##(A+k)##, with eigenvalue ##a+k##.

Now, (1) implies $$0 ~=~ AB|a\rangle - BA|a\rangle - \alpha A |a\rangle - \beta B|a\rangle ~=~ AB|a\rangle - aB |a\rangle - a \alpha|a\rangle - \beta B|a\rangle~.~~~~~ (2)$$We need to recast (2) in the form: $$0 ~=~ A(B+v) |a\rangle ~-~ w (B+v) |a\rangle ~=~ AB|a\rangle + va|a\rangle - w (B+v) |a\rangle ~,~~~~~ (3)$$ because that means ##(B+v)|a\rangle## is an eigenvector of ##A##, with eigenvalue ##w##.

Rearranging (3) and comparing with (2), we find that we need ##w=a+\beta## and ##a\alpha = v (w-a)##, which implies ##v = a\alpha/\beta##.

Therefore, ##(B+ a\alpha/\beta) |a\rangle## is an eigenvector of ##A## with eigenvalue ##(a+\beta)##.
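As a quick numerical sanity check of this claim (my own addition; the matrices below are just a hypothetical test pair chosen so that both coefficients are non-zero, not anything from the problem statement):

```python
import numpy as np

# Test pair: A = diag(1, -1), B = E + A with E the nilpotent shift.
# Then AB - BA = 2E = -2A + 2B, i.e. alpha = -2, beta = 2 (both non-zero).
A = np.array([[1.0, 0.0], [0.0, -1.0]])
B = np.array([[1.0, 1.0], [0.0, -1.0]])
alpha, beta = -2.0, 2.0
assert np.allclose(A @ B - B @ A, alpha * A + beta * B)

# |a> = eigenvector of A with eigenvalue a = -1
a = -1.0
ket_a = np.array([0.0, 1.0])
assert np.allclose(A @ ket_a, a * ket_a)

# Claim: (B + (a*alpha/beta) I)|a> is an eigenvector of A with eigenvalue a + beta.
w = (B + (a * alpha / beta) * np.eye(2)) @ ket_a
assert np.allclose(A @ w, (a + beta) * w)
print(w)  # [1. 0.]
```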

[Edit: I have struck out the following paragraph. Anyone following the proof above should now go straight to the later post in this thread by PeroK, and later by me, which complete the answer.]
The proof then continues (in the simple case of distinct ##A##-eigenvectors) by the standard technique of realizing that the vector ##(B+ a\alpha/\beta) |a\rangle## must be a multiple of an ##A##-eigenvector, resulting (after a shift of the eigenvalue) in an eigenvector of ##B##.

A more general solution would require a condition on the intersection of the nullspaces of ##A## and ##B##, among other things (IIUC). As an illustration of what can go wrong in the simpler case of ##AB=BA##, consider $$A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} ~~~\mbox{and}~~ B = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} ~,$$ for which ##\begin{pmatrix} 1 \\ 0 \end{pmatrix}## is an eigenvector of ##B\;## but not ##A## (since that vector lies in the nullspace of ##A##).

I'll leave it at that for now, and ask the question: how far did you [ @Infrared ] envisage that the answer should go? I'm guessing you'll require total completeness to qualify as a proper answer? :oldbiggrin:
 
  • #74
I want to elaborate on wrobel's point: If ##A = re^{i\theta}## (##r > 0##) and ##B = x+iy## are two complex numbers, then the definition of ##A^B## is the set

$$A^B = \{\exp(B \log A)\}$$

as ##\log A## ranges over all its values. In other words,

$$A^B = \{\exp\big((\ln r + i(\theta+2n\pi))(x + iy)\big) \mid n \in \mathbb{Z}\},$$

i.e.,

$$A^B = \{\exp\big(x \ln r - y(\theta+2n\pi) + i(y \ln r + x(\theta+2n\pi))\big) \mid n \in \mathbb{Z}\}$$

where ##\ln r = \int_{1}^{r} t^{-1} ~dt## as usual.
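For instance (my own illustration in Python, plugging ##A = i##, ##B = i## into the formula above, so ##r = 1##, ##\theta = \pi/2##, ##x = 0##, ##y = 1##): the set ##i^i## consists of the real numbers ##e^{-\pi/2 - 2n\pi}##, and Python's built-in `**` returns the principal one.

```python
import cmath, math

r, theta = 1.0, math.pi / 2            # A = i = r * e^(i*theta)
x, y = 0.0, 1.0                        # B = i = x + i*y

def value(n):
    # one branch of log(A), then exp(B * log(A))
    log_a = math.log(r) + 1j * (theta + 2 * n * math.pi)
    return cmath.exp((x + 1j * y) * log_a)

vals = [value(n) for n in range(-2, 3)]
print(vals)                            # all real: exp(-pi/2 - 2*n*pi)
print((1j) ** 1j)                      # the principal value, exp(-pi/2)
```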
 
  • #75
By linearity of [itex]T[/itex], it suffices to show [itex]T[/itex] is continuous at [itex]0[/itex]. Take [itex](x_n) \subseteq H_1[/itex] such that [itex]x_n \xrightarrow[n\to\infty]{}0\in H_1[/itex]. Let [itex]z\in H_2[/itex], then
[tex]
\langle z,Tx_n \rangle = \langle Sz, x_n \rangle \xrightarrow[n\to\infty]{}0 \in \mathbb K.
[/tex]
As [itex]z[/itex] is arbitrary it implies [itex]Tx_n\xrightarrow[n\to\infty]{}0 \in H_2[/itex].
 
  • #76
nuuskur said:
By linearity of [itex]T[/itex], it suffices to show [itex]T[/itex] is continuous at [itex]0[/itex]. Take [itex](x_n) \subseteq H_1[/itex] such that [itex]x_n \xrightarrow[n\to\infty]{}0\in H_1[/itex]. Let [itex]z\in H_1[/itex], then
[tex]
\langle z,Tx_n \rangle = \langle Sz, x_n \rangle \xrightarrow[n\to\infty]{}0 \in \mathbb K.
[/tex]
As [itex]z[/itex] is arbitrary it implies [itex]Tx_n\xrightarrow[n\to\infty]{}0 \in H_2[/itex].

I think you mean ##z\in H_2##.

Please justify the step

$$\forall z \in H_2: \langle z, Tx_n\rangle\to 0\implies T x_n\to 0$$

PS: Glad to see you back here!
 
  • #77
The answer to question number 13 is ##{\Large 6}##. But right now I don’t have an analytic proof (actually the margins of my paper are too small to contain the proof :cool:). Will a graphical proof be counted? I can prove that there is no integer solution to $$3a^3 +b^3 =6$$ by means of a graph.
 
  • #78
Adesh said:
The answer to question number 13 is ##{\Large 6}##. But right now I don’t have an analytic proof (actually the margins of my paper are too small to contain the proof :cool:). Will a graphical proof be counted? I can prove that there is no integer solution to $$3a^3 +b^3 =6$$ by means of a graph.

You can try to prove it using modular arithmetic. To get you started: modulo ##3## your equation becomes ##b^3=0##, from which it follows that ##b## must be a multiple of ##3##.
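The residue computations behind this hint (and one way to push it further) can be checked by brute force; a short Python sketch, my own addition:

```python
# Cubes modulo 3: b -> b^3 is the identity map, so b^3 = 0 (mod 3) forces b = 0 (mod 3).
print({b: pow(b, 3, 3) for b in range(3)})     # {0: 0, 1: 1, 2: 2}

# Pushing further: substituting b = 3c into 3a^3 + b^3 = 6 and dividing by 3
# gives a^3 + 9c^3 = 2, so a^3 = 2 (mod 9). But cubes mod 9 only hit {0, 1, 8}:
cubes_mod9 = {pow(a, 3, 9) for a in range(9)}
print(cubes_mod9)                              # {0, 1, 8}
print(2 in cubes_mod9)                         # False -> no integer solutions
```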
 
  • #79
Math_QED said:
... from which it follows that ##b## must be a multiple of ##3##.
I will probably ask: why does that follow?

That's a quirk of mine. I fight for the correct definition, which is the reason I ask here, like Don Quixote fought his windmills.
 
  • #80
Math_QED said:
You can try to prove it using modular arithmetic. To get you started: modulo ##3## your equation becomes ##b^3=0##, from which it follows that ##b## must be a multiple of ##3##.
Actually, I don’t know modular arithmetic and I need to study it. I will do that and come back with a proof.
 
  • #81
fresh_42 said:
I will probably ask: why does that follow?

That's a quirk of mine. I fight for the correct definition, which is the reason I ask here, like Don Quixote fought his windmills.

##b^3 = 0 \implies b = 0## since ##\mathbb{Z}/3\mathbb{Z}## has no zero divisors (##3## is prime), but I was just giving a hint :)
 
  • #82
Math_QED said:
##b^3 = 0 \implies b = 0## since ##\mathbb{Z}/3\mathbb{Z}## has no zero divisors (##3## is prime), but I was just giving a hint :)
I wasn't criticizing you at all. Au contraire! I liked your hint. I wanted @Adesh to read my answer and think about it ... and maybe learn the difference between prime and irreducible.
 
  • #83
fresh_42 said:
I wasn't criticizing you at all. I wanted @Adesh to read my answer and think about it ... and maybe learn the difference between prime and irreducible.

I didn't take it as criticism, no worries.
 
  • #84
Oh dear, I made a mistake. I will revise, @Math_QED (small world :) ). The implication in question is false. I think I was thinking about finite dimensions at the time I was writing the response :/
 
  • #85
nuuskur said:
Oh dear, I made a mistake. I will revise, @Math_QED (small world :) ). The implication in question is false. I think I was thinking about finite dimensions at the time I was writing the response :/

No worries! It was a trap carefully set up (you are not the first to walk into it)! As a hint: Do you know about the closed graph theorem in functional analysis?
 
  • #86
nuuskur said:
Ok, if I understand the closed graph theorem correctly, we get ..
It's equivalent to show the graph of [itex]T[/itex] is closed. Take [itex](y_n,Ty_n)\in \mathrm{gr}T,\ n\in\mathbb N,[/itex] such that
[tex]
y_n\xrightarrow [n\to\infty]{H_1} y,\quad Ty_n \xrightarrow [n\to\infty]{H_2} x.
[/tex]
By linearity of [itex]T[/itex] we have
[tex]
\|Ty-Ty_n\|^2 = \langle Ty - Ty_n, Ty-Ty_n \rangle = \langle ST(y-y_n), y-y_n \rangle \xrightarrow [n\to\infty]{} 0.
[/tex]
Thus [itex]Ty_n \to Ty[/itex] i.e [itex]Ty = x[/itex].
I think I see why the closed graph theorem is so useful here. We can assume without loss that the [itex]Ty_n[/itex] converge.

$$\langle ST(y-y_n), y-y_n \rangle \xrightarrow [n\to\infty]{} 0$$

Why is this?
 
  • #87
Math_QED said:
$$\langle ST(y-y_n), y-y_n \rangle \xrightarrow [n\to\infty]{} 0$$

Why is this?
I think I made a similar mistake :/ my thoughts were
[tex]
\langle STz_n, z_n \rangle \leq \|ST\| \|z_n\|^2 \xrightarrow [n\to\infty]{}0,
[/tex]
but that would make sense if [itex]ST[/itex] was continuous and it needn't be :(
 
  • #88
nuuskur said:
I think I made a similar mistake :/ my thoughts were
[tex]
\langle STz_n, z_n \rangle \leq \|ST\| \|z_n\|^2 \xrightarrow [n\to\infty]{}0,
[/tex]
but that would make sense if [itex]ST[/itex] was continuous and it needn't be :(

Yes, exactly. However, the solution I wrote down is not much longer than your attempt, so maybe try another attempt :)
 
  • #89
strangerep said:
Therefore, ##(B+ a\alpha/\beta) |a\rangle## is an eigenvector of ##A## with eigenvalue ##(a+\beta)##.

I'm having a little bit of trouble following this: If I apply ##A##, I get:

##A(B+ a\alpha/\beta) |a\rangle=AB|a\rangle+\frac{a\alpha}{\beta}A|a\rangle=AB|a\rangle+\frac{a^2\alpha}{\beta}|a\rangle,##

which I don't see how to simplify to ##(a+\beta)|a\rangle.## You can substitute ##AB=BA+\alpha A+\beta B##, but it doesn't look like your terms cancel.
strangerep said:
A more general solution would require a condition on the intersection of the nullspaces of ##A## and ##B##, among other things (IIUC). As an illustration of what can go wrong in the simpler case of ##AB=BA##, consider $$A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} ~~~\mbox{and}~~ B = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} ~,$$ for which ##\begin{pmatrix} 1 \\ 0 \end{pmatrix}## is an eigenvector of ##B\;## but not ##A## (since that vector lies in the nullspace of ##A##).

Actually, this is allowed! The nullspace of a matrix is the same thing as its ##0##-eigenspace. Eigenvectors are not allowed to be zero, but eigenvalues are.
 
  • #90
Here is what I'd say is a simpler answer to question 3 than the one I posted in #66:

Let the vector space be ##H##, the separable infinite-dimensional Hilbert space of square-summable sequences of real numbers, and let the convex subset be ##R##, defined as the increasing union

$$R = \bigcup_{n=1}^\infty \mathbb{R}^n$$

where each ##\mathbb{R}^n## is identified with ##\mathbb{R}^n \times \{0\} \subset \mathbb{R}^{n+1}##.

Clearly ##R## is convex. ##R## contains only points with finitely many nonzero components, so it is not all of ##H##.

Any half-space ##D## of ##H## is of the form ##D = \{x \in H \mid u \cdot x \le c\}##, where ##u## is a fixed unit vector in ##H## and ##c## is a fixed real number. Given this half-space ##D##, we show that ##R## is not contained in it:

Let ##B = \{e_n \mid n \in \mathbb{Z}^+\}## denote an orthonormal basis of ##H##, and define the set ##C = \pm B = \{\pm e_n \mid n \in \mathbb{Z}^+\}##. Clearly an element of ##C## making the smallest possible angle with the unit vector ##u## is less than 90° from ##u##. Call this element ##e_u##. Then the line

$$\mathbb{R} \cdot e_u$$

is entirely contained in ##R## but is not entirely contained in the half-space ##D##. Since ##D## was arbitrary, this shows ##R## is not contained in any half-space.
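The escape argument can be sanity-checked numerically for a truncated ##u## (my own sketch; the specific ##u## and ##c## below are arbitrary):

```python
import numpy as np

# A sample unit vector u (truncated to finitely many coordinates) and bound c.
rng = np.random.default_rng(0)
u = rng.normal(size=8)
u /= np.linalg.norm(u)
c = 1.0

n = int(np.argmax(np.abs(u)))     # the basis direction closest to u ...
e_u = np.zeros_like(u)
e_u[n] = np.sign(u[n])            # ... with the sign chosen so u . e_u > 0

t = (c + 1.0) / abs(u[n])         # walk far enough along the line R * e_u
x = t * e_u                       # one nonzero coordinate, so x lies in R
print(u @ x, ">", c)              # u . x = c + 1, outside the half-space
```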
 
  • #91
Infrared said:
I'm having a little bit of trouble following this: If I apply ##A##, I get:

##A(B+ a\alpha/\beta) |a\rangle=AB|a\rangle+\frac{a\alpha}{\beta}A|a\rangle=AB|a\rangle+\frac{a^2\alpha}{\beta}|a\rangle,##

which I don't see how to simplify to ##(a+\beta)|a\rangle.## You can substitute ##AB=BA+\alpha A+\beta B##, but it doesn't look like your terms cancel.
I get: $$\begin{array}{l}
A(B+ a\alpha/\beta) |a\rangle - (a+\beta)(B+ a\alpha/\beta)|a\rangle \\~~
~=~ (BA + \alpha A + \beta B)|a\rangle + a^2 \alpha/\beta |a\rangle ~-~ (a+\beta)(B+ a\alpha/\beta)|a\rangle \\~~
~=~ (aB + a \alpha + \beta B)|a\rangle + a^2 \alpha/\beta |a\rangle ~-~ (a+\beta)(B+ a\alpha/\beta)|a\rangle\\~~
~=~ a \alpha |a\rangle + a^2 \alpha/\beta |a\rangle ~-~ (a+\beta) a\alpha/\beta|a\rangle\\
~~ ~=~ 0 ~.
\end{array}
$$
 
  • #92
I can't live with myself if I don't solve this problem. This is becoming personal.
By the closed graph theorem, assume
[tex]
y_n \xrightarrow[n\to\infty]{H_1}y\qquad Ty_n \xrightarrow[n\to\infty]{H_2}x.
[/tex]
Was it really this simple all along ?!
[tex]
\begin{align*}
\|x-Ty\|^2 = \langle x-Ty,x-Ty \rangle &= \langle x-Ty, x \rangle - \langle x-Ty, Ty \rangle \\
&= \lim\langle x-Ty, Ty_n \rangle - \langle x-Ty,Ty \rangle \\
&= \lim \langle S(x-Ty), y_n \rangle - \langle x-Ty,Ty \rangle \\
&= \langle S(x-Ty), y \rangle - \langle x-Ty,Ty \rangle \\
&= \langle x-Ty,Ty \rangle - \langle x-Ty,Ty \rangle = 0.
\end{align*}
[/tex]
Thus [itex]Ty=x[/itex] and [itex]T[/itex] is continuous.
 
  • #93
nuuskur said:
I can't live with myself if I don't solve this problem. This is becoming personal.
By the closed graph theorem, assume
[tex]
y_n \xrightarrow[n\to\infty]{H_1}y\qquad Ty_n \xrightarrow[n\to\infty]{H_2}x.
[/tex]
Was it really this simple all along ?!
[tex]
\begin{align*}
\|x-Ty\|^2 = \langle x-Ty,x-Ty \rangle &= \langle x-Ty, x \rangle - \langle x-Ty, Ty \rangle \\
&= \lim\langle x-Ty, Ty_n \rangle - \langle x-Ty,Ty \rangle \\
&= \lim \langle S(x-Ty), y_n \rangle - \langle x-Ty,Ty \rangle \\
&= \langle S(x-Ty), y \rangle - \langle x-Ty,Ty \rangle \\
&= \langle x-Ty,Ty \rangle - \langle x-Ty,Ty \rangle = 0.
\end{align*}
[/tex]
Thus [itex]Ty=x[/itex] and [itex]T[/itex] is continuous.
I appreciate and share that feeling.
 
  • #94
nuuskur said:
I can't live with myself if I don't solve this problem. This is becoming personal.
By the closed graph theorem, assume
[tex]
y_n \xrightarrow[n\to\infty]{H_1}y\qquad Ty_n \xrightarrow[n\to\infty]{H_2}x.
[/tex]
Was it really this simple all along ?!
[tex]
\begin{align*}
\|x-Ty\|^2 = \langle x-Ty,x-Ty \rangle &= \langle x-Ty, x \rangle - \langle x-Ty, Ty \rangle \\
&= \lim\langle x-Ty, Ty_n \rangle - \langle x-Ty,Ty \rangle \\
&= \lim \langle S(x-Ty), y_n \rangle - \langle x-Ty,Ty \rangle \\
&= \langle S(x-Ty), y \rangle - \langle x-Ty,Ty \rangle \\
&= \langle x-Ty,Ty \rangle - \langle x-Ty,Ty \rangle = 0.
\end{align*}
[/tex]
Thus [itex]Ty=x[/itex] and [itex]T[/itex] is continuous.

Well done!
 
  • #95
Since [itex]f[/itex] is unbounded, pick [itex]x_n\in X [/itex] such that [itex]|f(x_n)|\geq n\|x_n\|,n\in\mathbb N[/itex]. Without loss of generality, assume [itex]\|x_n\| \equiv 1[/itex] so we have [itex]|f(x_n)| \geq n[/itex]. Fix [itex]x\in X[/itex]. Define
[tex]
y_n := x - \frac{f(x)}{f(x_n)}x_n, n\in\mathbb N.
[/tex]
One readily verifies that [itex]y_n\in\mathrm{Ker}f[/itex]. We also see that [itex]\|y_n-x\| = \left\lvert\frac{f(x)}{f(x_n)} \right\rvert \xrightarrow[n\to\infty]{}0[/itex]. Thus [itex]y_n\to x[/itex] and [itex]x\in\overline{\mathrm{Ker} f}[/itex].
We know [itex]f[/itex] must be non-zero, thus its image is [itex]\mathbb K[/itex], i.e. [itex]\dim\mathrm{Im} f = 1[/itex]. We also know the kernel is closed if and only if [itex]f[/itex] is continuous, thus [itex]\mathrm{Ker}f \neq \overline{\mathrm{Ker}f}[/itex]. Since [itex]X \cong \mathrm{Ker} f \oplus \mathrm{Im}f[/itex] we see
[tex]
\dim X - \dim \overline{\mathrm{Ker}f} < \dim X - \dim \mathrm{Ker}f = 1.
[/tex]
So it must be that [itex]X = \overline{\mathrm{Ker}f}[/itex].
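A concrete instance of this construction (my own illustration, not from the thread): on the space of finitely supported sequences with the sup norm, ##f(x) = \sum_n n\,x_n## is linear and unbounded, and the ##y_n## are easy to write down.

```python
from fractions import Fraction

# Finitely supported sequences as dicts {index: value}.
# f(x) = sum of n * x_n is linear but unbounded: f(e_n) = n while ||e_n|| = 1.
def f(x):
    return sum(n * v for n, v in x.items())

x = {1: Fraction(1)}                          # f(x) = 1
for n in (2, 5, 100):
    # y_n = x - (f(x)/f(x_n)) * x_n with x_n = e_n, so f(y_n) = 0
    y_n = {1: Fraction(1), n: -Fraction(f(x), n)}
    print(n, f(y_n))                          # 0 for every n
# ||y_n - x|| = 1/n -> 0, so x lies in the closure of Ker f.
```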
 
  • #96
nuuskur said:
Since [itex]f[/itex] is unbounded, pick [itex]x_n\in X [/itex] such that [itex]|f(x_n)|\geq n\|x_n\|,n\in\mathbb N[/itex]. Without loss of generality, assume [itex]\|x_n\| \equiv 1[/itex] so we have [itex]|f(x_n)| \geq n[/itex]. Fix [itex]x\in X[/itex]. Define
[tex]
y_n := x - \frac{f(x)}{f(x_n)}x_n, n\in\mathbb N.
[/tex]
One readily verifies the [itex]y_n\in\mathrm{Ker}f[/itex]. We also see that [itex]\|y_n-x\| = \left\lvert\frac{f(x)}{f(x_n)} \right\rvert \xrightarrow[n\to\infty]{}0[/itex]. Thus [itex]y_n\to x[/itex] and [itex]x\in\overline{\mathrm{Ker} f}[/itex].

Why is ##y_n \in \ker f##?
 
  • #97
Math_QED said:
Why is ##y_n \in \ker f##?
By linearity of [itex]f[/itex] :olduhh:
[tex]
f\left ( x - \frac{f(x)}{f(x_n)}x_n \right ) = f(x) - f \left ( \frac{f(x)}{f(x_n)}x_n \right ) = f(x) - f(x) = 0.
[/tex]
 
  • #98
nuuskur said:
By linearity of [itex]f[/itex] :olduhh:
[tex]
f\left ( x - \frac{f(x)}{f(x_n)}x_n \right ) = f(x) - f \left ( \frac{f(x)}{f(x_n)}x_n \right ) = f(x) - f(x) = 0.
[/tex]

Yes, obviously. Sorry I missed a factor when I read it first. The solution seems correct to me but I'm not the one moderating it.
 
  • #99
strangerep said:
I get: $$\begin{array}{l}
A(B+ a\alpha/\beta) |a\rangle - (a+\beta)(B+ a\alpha/\beta)|a\rangle \\~~
~=~ (BA + \alpha A + \beta B)|a\rangle + a^2 \alpha/\beta |a\rangle ~-~ (a+\beta)(B+ a\alpha/\beta)|a\rangle \\~~
~=~ (aB + a \alpha + \beta B)|a\rangle + a^2 \alpha/\beta |a\rangle ~-~ (a+\beta)(B+ a\alpha/\beta)|a\rangle\\~~
~=~ a \alpha |a\rangle + a^2 \alpha/\beta |a\rangle ~-~ (a+\beta) a\alpha/\beta|a\rangle\\
~~ ~=~ 0 ~.
\end{array}
$$

I think you can finish this off as follows. I'm going to switch notation:

If ##v_1## is an eigenvector of ##A## with eigenvalue ##\lambda_1##, then:
$$v_2 = [B + \frac {\lambda_1 \alpha}{\beta}I]v_1$$ is an eigenvector of ##A## with eigenvalue ##\lambda_2 = \lambda_1 + \beta##. And:
$$v_3 = [B + \frac {\lambda_2 \alpha}{\beta}I]v_2$$ is an eigenvector of ##A## with eigenvalue ##\lambda_3 = \lambda_2 + \beta##.

This generates an infinite sequence of distinct eigenvalues, unless for some ##k## we have:
$$v_{k+1} = [B + \frac {\lambda_k \alpha}{\beta}I]v_k = 0~,$$ in which case ##v_k## is a common eigenvector of ##A## and ##B + \frac {\lambda_k \alpha}{\beta}I##, hence also an eigenvector of ##B##.

We also have the case where ##\beta = 0## and:
$$AB - BA = \alpha A~,$$ in which case:
$$A[B + \frac{\lambda \alpha }{\lambda - 1}I]v = \lambda [B + \frac{\lambda \alpha}{\lambda - 1}I]v \ \ (\lambda \ne 1)$$
so that ##[B + \frac{\lambda \alpha}{\lambda - 1}I]v## is an eigenvector of ##A## with eigenvalue ##\lambda##. Hence ##C = B + \frac{\lambda \alpha}{\lambda - 1}I## maps the ##\lambda##-eigenspace of ##A## into itself and must have an eigenvector in this space. Therefore ##C##, and hence ##B##, has a common eigenvector with ##A##.

Finally, if ##A## only has the eigenvalue ##\lambda = 1##, then ##A' = 2A## has the same eigenvectors as ##A## and the result follows as above.
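The ladder in the ##\beta \neq 0## case can be watched terminating on a small example (my own sketch; the pair below is hypothetical and satisfies ##AB - BA = -2A + 2B##, i.e. ##\alpha = -2##, ##\beta = 2##):

```python
import numpy as np

A = np.array([[1.0, 0.0], [0.0, -1.0]])
B = np.array([[1.0, 1.0], [0.0, -1.0]])
alpha, beta = -2.0, 2.0
assert np.allclose(A @ B - B @ A, alpha * A + beta * B)

v, lam = np.array([0.0, 1.0]), -1.0        # eigenvector of A, eigenvalue -1
while True:
    v_next = (B + (lam * alpha / beta) * np.eye(2)) @ v
    if np.allclose(v_next, 0):
        break                              # ladder terminates: v is the common eigenvector
    v, lam = v_next, lam + beta            # next rung: eigenvalue lam + beta

print(v, lam)                              # [1. 0.] 1.0
assert np.allclose(A @ v, lam * v)         # eigenvector of A ...
assert np.allclose(B @ v, v)               # ... and of B (eigenvalue 1 here)
```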
 
  • #100
Math_QED said:
Well done!

How do we know that the limit ##x = \lim Ty_n## exists?
 
  • #101
PeroK said:
How do we know that the limit ##x = \lim Ty_n## exists?
This is what the closed graph theorem tells us. We can assume without loss of generality that the sequence of images also converges.
 
  • #102
PeroK said:
How do we know that the limit ##x = \lim Ty_n## exists?

The closed graph theorem allows us to assume that. That's why it is a useful theorem. You must show ##y_n \to y## implies ##Ty_n \to Ty ## to deduce continuity but the theorem says you can already assume that ##(Ty_n)_n## converges, which makes life easier.
 
  • #103
nuuskur said:
Since [itex]f[/itex] is unbounded, pick [itex]x_n\in X [/itex] such that [itex]|f(x_n)|\geq n\|x_n\|,n\in\mathbb N[/itex]. Without loss of generality, assume [itex]\|x_n\| \equiv 1[/itex] so we have [itex]|f(x_n)| \geq n[/itex]. Fix [itex]x\in X[/itex]. Define
[tex]
y_n := x - \frac{f(x)}{f(x_n)}x_n, n\in\mathbb N.
[/tex]
One readily verifies the [itex]y_n\in\mathrm{Ker}f[/itex]. We also see that [itex]\|y_n-x\| = \left\lvert\frac{f(x)}{f(x_n)} \right\rvert \xrightarrow[n\to\infty]{}0[/itex]. Thus [itex]y_n\to x[/itex] and [itex]x\in\overline{\mathrm{Ker} f}[/itex].
We know [itex]f[/itex] must be non-zero, thus its image is [itex]\mathbb K[/itex] i.e [itex]\dim\mathrm{Im} f = 1[/itex]. We also know the kernel is closed if and only if [itex]f[/itex] is continuous, thus [itex]\mathrm{Ker}f \neq \overline{\mathrm{Ker}f}[/itex]. Since [itex]X = \mathrm{Ker} f \oplus \mathrm{Im}f[/itex] we see
[tex]
\dim X - \dim \overline{\mathrm{Ker}f} < \dim X - \dim \mathrm{Ker}f = 1.
[/tex]
So it must be that [itex]X = \overline{\mathrm{Ker}f}[/itex].

The second approach is how I solved it as well! However, some care has to be taken: if ##X## is infinite-dimensional, you wrote ##\infty-\infty##.
 
  • #104
nuuskur said:
This is what the closed graph theorem tells us. We can assume without loss of generality that the sequence of images also converges.

Math_QED said:
The closed graph theorem allows us to assume that. That's why it is a useful theorem. You must show ##y_n \to y## implies ##Ty_n \to Ty ## to deduce continuity but the theorem says you can already assume that ##(Ty_n)_n## converges, which makes life easier.

For any linear operator?
 
  • #105
PeroK said:
For any linear operator?
Yes. If the domain and codomain are complete spaces, then the proposition holds. In problem 1, it is applicable.
 
