- #71
suremarc
I didn't notice there was one more problem, so here's my go at it.

fresh_42 said:
5. Let ##A\in \mathbb{M}(n,\mathbb{R})## be a real square matrix. Show that there is a parameterized path ##x\, : \,\mathbb{R}\longrightarrow \mathbb{R}^n## as a solution of the differential equation ##\dot x(t)=Ax(t)## which is unique for any given initial condition ##x(t_0)=x_0.##
Claim. ##\mathbf{x}(t)## is analytic on ##\mathbb{R}##.
Proof. From ##\dot{\mathbf{x}}(t)=A\mathbf{x}(t)## it follows immediately that ##\mathbf{x}^{(k)}(t)=A^k\,\mathbf{x}(t)##. Introduce the operator norm ##\|A\|=\sup_{\mathbf{x}\neq 0}\frac{\|A\mathbf{x}\|}{\|\mathbf{x}\|}##, which is subadditive and submultiplicative. It follows that $$e^{(t-t_0)A}\,\mathbf{x}(t_0)=\sum_{k=0}^\infty \frac{\mathbf{x}^{(k)}(t_0)}{k!}(t-t_0)^k$$ exists for all ##t,t_0\in\mathbb{R}##: convergence of ##e^{(t-t_0)A}## is guaranteed by the operator norm, and convergence of the RHS by the standard norm on ##\mathbb{R}^n##, since ##\|\mathbf{x}^{(k)}(t_0)\|\leq\|A\|^k\|\mathbf{x}(t_0)\|##. One can check by term-by-term differentiation that this series solves the differential equation ##\dot{\mathbf{x}}(t)=A\,\mathbf{x}(t)##, so a solution exists for any given initial condition.
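As a quick numerical sanity check (not part of the proof), here is a short Python sketch comparing the truncated series ##\sum_{k=0}^N \frac{A^k\mathbf{x}_0}{k!}(t-t_0)^k## against a standard ODE integrator. It assumes numpy and scipy are available; the matrix ##A##, the initial condition, and the truncation order ##N## are arbitrary choices made purely for illustration.

```python
# Quick numerical sanity check (not part of the proof): the truncated series
# sum_{k=0}^N A^k x0 (t - t0)^k / k! should match a standard ODE integrator.
# A, x0, and the truncation order N are arbitrary choices for illustration.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))          # arbitrary real square matrix
x0 = rng.standard_normal(n)              # arbitrary initial condition x(t0) = x0
t0, t = 0.0, 1.5

def series_solution(A, x0, dt, N=60):
    """Partial sum of e^{dt A} x0 = sum_{k=0}^N A^k x0 dt^k / k!."""
    term = x0.copy()
    total = term.copy()
    for k in range(1, N + 1):
        term = (A @ term) * dt / k       # build A^k x0 dt^k / k! iteratively
        total = total + term
    return total

series = series_solution(A, x0, t - t0)
numeric = solve_ivp(lambda s, x: A @ x, (t0, t), x0, rtol=1e-10, atol=1e-12).y[:, -1]
print(np.linalg.norm(series - numeric))  # should be very small
```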
We now prove that ##\mathbf{x}(t)=e^{(t-t_0)A}\,\mathbf{x}(t_0)##; more precisely, $$(\forall t,t_0\in\mathbb{R})\quad\lim_{N\rightarrow\infty}\Bigg(\mathbf{x}(t)-\sum_{k=0}^N\frac{\mathbf{x}^{(k)}(t_0)}{k!}(t-t_0)^k\Bigg)=\mathbf{0}.$$ By Taylor's theorem, ##\mathbf{x}(t)=\Big(\sum_{k=0}^N\frac{\mathbf{x}^{(k)}(t_0)}{k!}(t-t_0)^k\Big)+\mathbf{r}_N(t)##, where $$\mathbf{r}_N(t)=\frac{\mathbf{x}^{(N+1)}(\xi)}{(N+1)!}(t-t_0)^{N+1}$$ for some ##\xi## between ##t_0## and ##t##. Since ##\mathbf{x}^{(N+1)}(\xi)=A^{N+1}\,\mathbf{x}(\xi)##, taking norms gives $$\|\mathbf{r}_N(t)\|\leq\frac{\|A\|^{N+1}\|\mathbf{x}(\xi)\|}{(N+1)!}|t-t_0|^{N+1}\leq\frac{\|A\|^{N+1}\,M}{(N+1)!}|t-t_0|^{N+1},$$ where ##M=\max_s\|\mathbf{x}(s)\|## over the compact interval between ##t_0## and ##t## is finite by continuity (this matters because ##\xi## depends on ##N##). Hence ##\lim_{N\rightarrow\infty}\mathbf{r}_N(t)=\mathbf{0}##, and the series for ##e^{(t-t_0)A}\,\mathbf{x}(t_0)## converges pointwise to ##\mathbf{x}(t)##. In particular, any solution with ##\mathbf{x}(t_0)=\mathbf{x}_0## must equal ##e^{(t-t_0)A}\,\mathbf{x}_0##, which gives uniqueness.
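Along the same lines, here is a small numerical illustration (again with an arbitrarily chosen ##A## and ##\mathbf{x}_0##, and with ##M## merely estimated on a grid) of how the remainder norm compares with the bound ##\frac{\|A\|^{N+1}M}{(N+1)!}|t-t_0|^{N+1}##:

```python
# Numerical illustration (arbitrary A, x0): remainder of the Taylor series of
# x(t) = e^{(t-t0)A} x0 versus the bound ||A||^{N+1} M |t-t0|^{N+1} / (N+1)!.
import math
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))           # arbitrary real square matrix
x0 = rng.standard_normal(n)               # arbitrary initial condition
t0, t = 0.0, 2.0
dt = t - t0

x_t = expm(dt * A) @ x0                   # reference value of x(t)
opnorm = np.linalg.norm(A, 2)             # operator (spectral) norm ||A||
# M >= max ||x(s)|| on [t0, t], estimated on a grid for illustration only
M = max(np.linalg.norm(expm(s * A) @ x0) for s in np.linspace(0.0, dt, 200))

partial, term = np.zeros(n), x0.copy()
for N in range(25):
    partial = partial + term              # partial sum through order N
    term = (A @ term) * dt / (N + 1)      # next term A^{N+1} x0 dt^{N+1}/(N+1)!
    remainder = np.linalg.norm(x_t - partial)
    bound = opnorm ** (N + 1) * M * dt ** (N + 1) / math.factorial(N + 1)
    print(N, remainder, bound)            # remainder is expected to stay below the bound
```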
I wonder if Taylor's theorem applies in the infinite-dimensional setting with a bounded operator. If so, I think this result should generalize.
Edit: I just realized you have to modify the formula for the remainder term slightly. To be precise, you may need a different ##\xi## for each component of ##\mathbf{r}_N(t)##, although the theorem still holds.