The Metric Space R^n and Sequences .... Remark by Carothers, page 47 ....

  • #1
Math Amateur
I am reading N. L. Carothers' book: "Real Analysis". ... ...

I am focused on Chapter 3: Metrics and Norms ... ...

I need help with a remark by Carothers concerning convergent sequences in \(\displaystyle \mathbb{R}^n\) ...

Now, on page 47, Carothers writes the following:
[Attachment: Carothers - Remarks on R^n as a metric space, page 47]
In the above text from Carothers we read the following:

" ... ... it follows that a sequence of vectors \(\displaystyle x^{ (k) } = ( x_1^k, \ ... \ ... \ , x_n^k)\) in \(\displaystyle \mathbb{R}^n\) converges (is Cauchy) if and only if each of the coordinate sequences \(\displaystyle ( x_j^k )\) converges in \(\displaystyle \mathbb{R}\) ... ... "
My question is as follows:

Why exactly does it follow that a sequence of vectors \(\displaystyle x^{ (k) } = ( x_1^k, \ ... \ ... \ , x_n^k)\) in \(\displaystyle \mathbb{R}^n\) converges (is Cauchy) if and only if each of the coordinate sequences \(\displaystyle ( x_j^k )\) converges in \(\displaystyle \mathbb{R}\) ... ... ?
Help will be appreciated ...

Peter
 

  • #2
it may be convenient to switch norms slightly here...
in particular with

$\mathbf z:= \mathbf x - \mathbf y$

and $\mathbf z \in \mathbb R^n$

convince yourself that
$\big \Vert \mathbf z \big \Vert_2 \leq \big \Vert \mathbf z \big \Vert_1 \leq \sqrt{n}\cdot \big \Vert \mathbf z \big \Vert_2$

where the first inequality is the triangle inequality and the second is Cauchy-Schwarz (with the all-ones trick). To a large extent the 1 norm allows you to linearize the distance computed on each component... can you prove Carothers' comment that convergence holds iff each coordinate sequence $(x_j^k)$ converges in $\mathbb R$, using the 1 norm? The first leg should be easy -- select $\frac{\epsilon}{n}$ for each component and import your favorite single variable real analysis results. The second leg is similar...
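a sketch of that chain, in case it's useful -- the first inequality is the triangle inequality applied to $\mathbf z = \sum_j z_j \mathbf e_j$, where $\mathbf e_1, \dots, \mathbf e_n$ is the standard basis, and the second is Cauchy-Schwarz against the all-ones vector:

$\big \Vert \mathbf z \big \Vert_2 = \Big \Vert \sum_{j=1}^n z_j \mathbf e_j \Big \Vert_2 \leq \sum_{j=1}^n \vert z_j \vert \cdot \big \Vert \mathbf e_j \big \Vert_2 = \big \Vert \mathbf z \big \Vert_1$

$\big \Vert \mathbf z \big \Vert_1 = \sum_{j=1}^n \vert z_j \vert \cdot 1 \leq \Big( \sum_{j=1}^n z_j^2 \Big)^{1/2} \Big( \sum_{j=1}^n 1^2 \Big)^{1/2} = \sqrt{n} \cdot \big \Vert \mathbf z \big \Vert_2$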
- - - -
Then the above chain of inequalities gives you the result with the 2 norm / standard metric on $\mathbf x, \mathbf y$.
 
  • #3
steep said:
it may be convenient to switch norms slightly here...
in particular with

$\mathbf z:= \mathbf x - \mathbf y$

and $\mathbf z \in \mathbb R^n$

convince yourself that
$\big \Vert \mathbf z \big \Vert_2 \leq \big \Vert \mathbf z \big \Vert_1 \leq \sqrt{n}\cdot \big \Vert \mathbf z \big \Vert_2$

where the first inequality is the triangle inequality and the second is Cauchy-Schwarz (with the all-ones trick). To a large extent the 1 norm allows you to linearize the distance computed on each component... can you prove Carothers' comment that convergence holds iff each coordinate sequence $(x_j^k)$ converges in $\mathbb R$, using the 1 norm? The first leg should be easy -- select $\frac{\epsilon}{n}$ for each component and import your favorite single variable real analysis results. The second leg is similar...
- - - -
Then the above chain of inequalities gives you the result with the 2 norm / standard metric on $\mathbf x, \mathbf y$.
Thanks for the help steep ...

Will try to prove the following based on your advice ...

... a sequence of vectors \(\displaystyle x^{ (k) } = ( x_1^k, \ \ldots \ , x_n^k)\) in \(\displaystyle \mathbb{R}^n\) converges if and only if each of the coordinate sequences \(\displaystyle ( x_j^k )\) converges in \(\displaystyle \mathbb{R}\) ... ...

I think we may proceed as follows, where \(\displaystyle z = x - y\) ...

\(\displaystyle \| z \|_2 = \left\| \sum_{ j = 1}^n z_j e_j \right\|_2 \leq \sum_{ j = 1}^n \| z_j e_j \|_2 = \sum_{ j = 1}^n | z_j | \, \| e_j \|_2 = \sum_{ j = 1}^n | z_j |\)

Thus \(\displaystyle \| x - y \|_2 = \left( \sum_{ j = 1}^n | x_j - y_j |^2 \right)^{ \frac{1}{2} } \leq \sum_{ j = 1}^n | x_j - y_j | \) ... ... ... (1)

Now ... given (1) above ...

... if, for each \(\displaystyle j\), the coordinate sequence \(\displaystyle ( x_j^k)_{ k = 1}^{ \infty }\) converges to a limit \(\displaystyle y_j \in \mathbb{R}\) ...

... then for every \(\displaystyle \epsilon \gt 0 \ \exists \ N(j ; \epsilon )\) such that for \(\displaystyle k \geq N(j ; \epsilon )\) ...

... we have \(\displaystyle | x_j^k - y_j | \lt \frac{ \epsilon }{ n }\) ...

But then \(\displaystyle x^{ (k) }\) converges to \(\displaystyle y = ( y_1, \ \ldots \ , y_n )\) since ...

... for every \(\displaystyle \epsilon \gt 0\), putting \(\displaystyle N( \epsilon ) = \max \{ N(1 ; \epsilon ), \ \ldots \ , N(n ; \epsilon ) \}\), for \(\displaystyle k \geq N( \epsilon )\) we have ...

... \(\displaystyle \| x^{ (k) } - y \|_2 \leq \sum_{ j = 1}^n | x_j^k - y_j | \lt \frac{ \epsilon }{ n } + \ \ldots \ + \frac{ \epsilon }{ n } = \epsilon\)
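(I think the Cauchy version of this direction works the same way: if each coordinate sequence is Cauchy, then for \(\displaystyle k, m \geq N( \epsilon ) = \max \{ N(1 ; \epsilon ), \ \ldots \ , N(n ; \epsilon ) \}\) inequality (1) gives \(\displaystyle \| x^{ (k) } - x^{ (m) } \|_2 \leq \sum_{ j = 1}^n | x_j^k - x_j^m | \lt \epsilon\) ... so \(\displaystyle ( x^{ (k) } )\) is Cauchy in \(\displaystyle \mathbb{R}^n\) ...)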
Is that correct?

The proof is similar for Cauchy sequences in \(\displaystyle \mathbb{R}^n\) ...

Peter
 
  • #4
Peter said:
Thanks for the help steep ...

Will try to prove the following based on your advice ...

... a sequence of vectors \(\displaystyle x^{ (k) } = ( x_1^k, \ \ldots \ , x_n^k)\) in \(\displaystyle \mathbb{R}^n\) converges if and only if each of the coordinate sequences \(\displaystyle ( x_j^k )\) converges in \(\displaystyle \mathbb{R}\) ... ...

I think we may proceed as follows, where \(\displaystyle z = x - y\) ...

\(\displaystyle \| z \|_2 = \left\| \sum_{ j = 1}^n z_j e_j \right\|_2 \leq \sum_{ j = 1}^n \| z_j e_j \|_2 = \sum_{ j = 1}^n | z_j | \, \| e_j \|_2 = \sum_{ j = 1}^n | z_j |\)

Thus \(\displaystyle \| x - y \|_2 = \left( \sum_{ j = 1}^n | x_j - y_j |^2 \right)^{ \frac{1}{2} } \leq \sum_{ j = 1}^n | x_j - y_j | \) ... ... ... (1)

Now ... given (1) above ...

... if, for each \(\displaystyle j\), the coordinate sequence \(\displaystyle ( x_j^k)_{ k = 1}^{ \infty }\) converges to a limit \(\displaystyle y_j \in \mathbb{R}\) ...

... then for every \(\displaystyle \epsilon \gt 0 \ \exists \ N(j ; \epsilon )\) such that for \(\displaystyle k \geq N(j ; \epsilon )\) ...

... we have \(\displaystyle | x_j^k - y_j | \lt \frac{ \epsilon }{ n }\) ...

But then \(\displaystyle x^{ (k) }\) converges to \(\displaystyle y = ( y_1, \ \ldots \ , y_n )\) since ...

... for every \(\displaystyle \epsilon \gt 0\), putting \(\displaystyle N( \epsilon ) = \max \{ N(1 ; \epsilon ), \ \ldots \ , N(n ; \epsilon ) \}\), for \(\displaystyle k \geq N( \epsilon )\) we have ...

... \(\displaystyle \| x^{ (k) } - y \|_2 \leq \sum_{ j = 1}^n | x_j^k - y_j | \lt \frac{ \epsilon }{ n } + \ \ldots \ + \frac{ \epsilon }{ n } = \epsilon\)
Is that correct?

The proof is similar for Cauchy sequences in \(\displaystyle \mathbb{R}^n\) ...

Peter

I think this is basically right. You may be approaching it in a more succinct manner than I am... I have this in my head as 2 steps, first sufficiency, then necessity. The above clearly gives sufficiency. I'm not sure I saw the second leg, necessity, though.

Another way to get it is to use the infinity /max norm, so

$\big\Vert \mathbf z \big \Vert_\infty^2 = \max\big(z_1^2, z_2^2, ..., z_n^2\big) \leq \sum_{i=1}^n z_i^2 = \big \Vert \mathbf z\big \Vert_2^2$
hence taking square roots over non-negative numbers gives

$\big\Vert \mathbf z \big \Vert_\infty \leq \big \Vert \mathbf z\big \Vert_2$
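(for completeness, the comparison runs the other way too: $\big \Vert \mathbf z \big \Vert_2^2 = \sum_{i=1}^n z_i^2 \leq n \cdot \big \Vert \mathbf z \big \Vert_\infty^2$, i.e. $\big \Vert \mathbf z \big \Vert_2 \leq \sqrt{n} \cdot \big \Vert \mathbf z \big \Vert_\infty$, so the max norm and the 2 norm determine exactly the same convergent sequences)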

so for the second leg, suppose that there is (at least one) coordinate $j$ that doesn't converge -- i.e. there is some $\epsilon_0 \gt 0$ such that no $N$ is large enough to give $\vert z_j^{(n)} \vert \lt \epsilon_0$ for all $n\geq N$... then if the sequence still converged you'd have

$\epsilon_0 \leq \vert z_j^{(n)} \vert \leq \big \Vert \mathbf z^{(n)}\big \Vert_\infty \leq \big \Vert \mathbf z^{(n)}\big \Vert_2 \lt \epsilon$
for some $n \geq N$, no matter how large $N$ is chosen. Selecting $\epsilon := \epsilon_0$ then contradicts the definition of convergence.

There's probably a slightly nicer way of showing it, but this is at the heart of the necessity of component-wise convergence.
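For a concrete instance of the failure mode: in $\mathbb R^2$ take $\mathbf x^{(k)} = (1/k, \, (-1)^k)$. The first coordinate converges but the second doesn't, and for any candidate limit $\mathbf y$ at least one of $\vert 1 - y_2 \vert$, $\vert -1 - y_2 \vert$ is $\geq 1$, hence $\big \Vert \mathbf x^{(k)} - \mathbf y \big \Vert_2 \geq \vert (-1)^k - y_2 \vert \geq 1$ for infinitely many $k$, so the vector sequence can't converge.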
 
  • #5
steep said:
I think this is basically right. You may be approaching it in a more succinct manner than I am... I have this in my head as 2 steps, first sufficiency, then necessity. The above clearly gives sufficiency. I'm not sure I saw the second leg, necessity, though.

Another way to get it is to use the infinity /max norm, so

$\big\Vert \mathbf z \big \Vert_\infty^2 = \max\big(z_1^2, z_2^2, ..., z_n^2\big) \leq \sum_{i=1}^n z_i^2 = \big \Vert \mathbf z\big \Vert_2^2$
hence taking square roots over non-negative numbers gives

$\big\Vert \mathbf z \big \Vert_\infty \leq \big \Vert \mathbf z\big \Vert_2$

so for the second leg, suppose that there is (at least one) coordinate $j$ that doesn't converge -- i.e. there is some $\epsilon_0 \gt 0$ such that no $N$ is large enough to give $\vert z_j^{(n)} \vert \lt \epsilon_0$ for all $n\geq N$... then if the sequence still converged you'd have

$\epsilon_0 \leq \vert z_j^{(n)} \vert \leq \big \Vert \mathbf z^{(n)}\big \Vert_\infty \leq \big \Vert \mathbf z^{(n)}\big \Vert_2 \lt \epsilon$
for some $n \geq N$, no matter how large $N$ is chosen. Selecting $\epsilon := \epsilon_0$ then contradicts the definition of convergence.

There's probably a slightly nicer way of showing it, but this is at the heart of the necessity of component-wise convergence.
Hi steep ...

Thanks again for your considerable assistance ... Thought I would try a direct approach to demonstrate that ...

... if a sequence of vectors \(\displaystyle x^{ (k) } = ( x_1^k, \ ... \ ... \ , x_n^k)\) in \(\displaystyle \mathbb{R}^n\) converges ...

... then ... each of the coordinate sequences \(\displaystyle ( x_j^k )\) converges in \(\displaystyle \mathbb{R}\) ... ...

Proceed as follows, where \(\displaystyle z = x - y\) ...

\(\displaystyle | z_j | = \left( z_j^2 \right)^{ \frac{1}{2} } \leq \left( \sum_{ i = 1}^n z_i^2 \right)^{ \frac{1}{2} } = \| z \|_2\) ... ... ... ... ... (2)

Given (2) above ... we have ...

... if \(\displaystyle x^{ (k) }\) converges to \(\displaystyle y\) in \(\displaystyle \mathbb{R}^n\) ...

... then ... for every \(\displaystyle \epsilon \gt 0 \ \exists \ N( \epsilon )\) such that for \(\displaystyle k \geq N( \epsilon )\) we have ...

... \(\displaystyle \| x^{ (k) } - y \|_2 \leq \epsilon\) ...

But then, for arbitrary \(\displaystyle j\), the sequence \(\displaystyle ( x_j^k)_{ k = 1}^{ \infty }\) converges to the limit \(\displaystyle y_j \in \mathbb{R}\) ...

... since for every \(\displaystyle \epsilon \gt 0 \ \exists \ N( \epsilon )\) such that for \(\displaystyle k \geq N( \epsilon )\) we have ...

\(\displaystyle | x_j^k - y_j | \leq \| x^{ (k) } - y \|_2 \leq \epsilon\) ...
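(I think the Cauchy version of this direction is the same: since \(\displaystyle | x_j^k - x_j^m | \leq \| x^{ (k) } - x^{ (m) } \|_2\) by (2), if \(\displaystyle ( x^{ (k) } )\) is Cauchy in \(\displaystyle \mathbb{R}^n\) then each coordinate sequence \(\displaystyle ( x_j^k )\) is Cauchy in \(\displaystyle \mathbb{R}\) ...)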
Is that correct?
Thanks once again for the help!

Peter
 
  • #6
Peter said:
...
But then, for arbitrary \(\displaystyle j\), the sequence \(\displaystyle ( x_j^k)_{ k = 1}^{ \infty }\) converges to the limit \(\displaystyle y_j \in \mathbb{R}\) ...

... since for every \(\displaystyle \epsilon \gt 0 \ \exists \ N( \epsilon )\) such that for \(\displaystyle k \geq N( \epsilon )\) we have ...

\(\displaystyle | x_j^k - y_j | \leq \| x^{ (k) } - y \|_2 \leq \epsilon\) ...

re-reading this thread with a fresh set of eyes I see that your post 3 really was

$\text{convergence in each component} \longrightarrow \text{convergence of vector in } \mathbb R^n$

and this current post 5 is the other leg
$\text{convergence of vector in } \mathbb R^n \longrightarrow \text{convergence in each component} $

and yes I think it works. The only nitpick I'll do is to make sure to use strictness in the inequality with the epsilon, i.e.
$\| x^{ (k) } - y \|_2 \lt \epsilon$

any other items are immaterial... and looking back through my posts it looks like I overloaded $n$, using it both for the dimension of $\mathbb R^n$ and for the sequence index, so no need to nitpick too much ;)
 
  • #7
steep said:
re-reading this thread with a fresh set of eyes I see that your post 3 really was

$\text{convergence in each component} \longrightarrow \text{convergence of vector in } \mathbb R^n$

and this current post 5 is the other leg
$\text{convergence of vector in } \mathbb R^n \longrightarrow \text{convergence in each component} $

and yes I think it works. The only nitpick I'll do is to make sure to use strictness in the inequality with the epsilon, i.e.
$\| x^{ (k) } - y \|_2 \lt \epsilon$

any other items are immaterial... and looking back through my posts it looks like I overloaded $n$, using it both for the dimension of $\mathbb R^n$ and for the sequence index, so no need to nitpick too much ;)
Thanks for all your help, steep ...

I really appreciate it ...

Peter
 
