On the approximate solution obtained through Euler's method

In summary, "On the approximate solution obtained through Euler's method" discusses the numerical technique known as Euler's method for solving ordinary differential equations. It explains how the method approximates solutions by using tangent line segments at discrete points, effectively transforming a continuous problem into a series of linear approximations. The paper highlights the method's simplicity and ease of implementation, while also addressing its limitations, such as potential errors that arise from step size selection and the method's overall accuracy. It concludes with considerations for improving the precision of the approximation, including adaptive step sizing and the use of more advanced methods.
  • #1
psie
TL;DR Summary
I'm reading about Euler's method to construct approximate solutions to ODEs in Ordinary Differential Equations by Andersson and Böiers. I have questions about properties of the approximate solution.
This is a bit of a longer post. I have tried to be as brief as possible while still being self-contained. My questions probably do not have much to do with ODEs, but this is the context in which they arose. Grateful for any help.

In what follows ##|\cdot|## denotes either the absolute value of a scalar or the Euclidean norm of a vector (denoted in bold). First a definition of what it means to be an approximate solution:
Definition 1. Let ##I## be an interval on the real axis, and ##\Omega## an open set in ##\mathbf R\times\mathbf{R}^n##. Assume that the function ##\pmb{f}:\Omega\to\mathbf{R}^n## is continuous. A continuous function ##\pmb{x}(t),\ t\in I##, is called an ##\varepsilon##-approximate solution of the system ##\pmb{x}'=\pmb{f}(t,\pmb{x})## if ##(t,\pmb{x}(t))\in\Omega## when ##t\in I## and $$\left|\pmb{x}(t'')-\pmb{x}(t')-\int_{t'}^{t''} \pmb{f}(s,\pmb{x}(s))ds\right|\leq \varepsilon|t''-t'|\quad \text{when } t',t''\in I.$$
Next, a theorem (stated without proof, for brevity) giving an error estimate for the approximate solution:
Theorem 1. Assume that ##\pmb{f}(t,\pmb{x})## is continuous in ##\Omega\subseteq \mathbf{R}\times\mathbf{R}^n## and satisfies the Lipschitz condition $$|\pmb{f}(t,\pmb{x})-\pmb{f}(t,\pmb{y})|\leq L|\pmb{x}-\pmb{y}|, \quad (t,\pmb{x}),(t,\pmb{y})\in\Omega.$$ Let ##\pmb{\tilde{x}}(t)## be an ##\varepsilon##-approximate and ##\pmb{x}(t)## an exact solution of ##\pmb{x}'=\pmb{f}(t,\pmb{x})## in ##\Omega## when ##t\in I##. For an arbitrary point ##t_0## in ##I## we then have $$|\pmb{\tilde{x}}(t)-\pmb{x}(t)|\leq |\pmb{\tilde{x}}(t_0)-\pmb{x}(t_0)|e^{L|t-t_0|}+\frac{\varepsilon}{L}(e^{L|t-t_0|}-1),\quad t\in I.$$
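Just to spell out a special case (this is not in the book, but it is the form used later): if the approximate and exact solutions start from the same initial value, ##\pmb{\tilde{x}}(t_0)=\pmb{x}(t_0)##, the first term vanishes and the estimate reduces to $$|\pmb{\tilde{x}}(t)-\pmb{x}(t)|\leq \frac{\varepsilon}{L}\left(e^{L|t-t_0|}-1\right),\quad t\in I,$$ so on a bounded interval the error is controlled by ##\varepsilon## alone.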
Euler's method works as follows. Consider the equation ##x'=f(t,x)## and an initial value ##(t_0,x(t_0))##. We know the slope of the tangent line through ##(t_0,x(t_0))##. Follow this tangent a bit to the right, to ##t=t_1=t_0+\delta##, and repeat the procedure for the point ##(t_1,x(t_1))##. This way one gets a broken curve of straight line segments resembling the solution.

To make the definition of the broken curve precise, consider the system ##\pmb{x}'=\pmb{f}(t,\pmb{x})## and an initial value ##(t_0,\pmb{x}_0)##. Divide the interval ##[t_0,t_0+a]## into equally long subintervals ##t_0<t_1<\ldots<t_m=t_0+a##, and put ##\delta=t_j-t_{j-1}##. Then define the function ##\pmb{x}_{\delta}## recursively at the step points ##t_j## by \begin{align}
&\pmb{x}_{\delta}(t_0)=\pmb{x}_0, \nonumber \\
&\pmb{x}_{\delta}(t_{j+1})=\pmb{x}_{\delta}(t_j)+(t_{j+1}-t_j)\pmb{f}(t_j,\pmb{x}_{\delta}(t_j)),\quad j=0,1,\ldots,m-1. \tag1
\end{align}
Between the step points the curve of ##\pmb{x}_{\delta}## is supposed to be a straight line, so $$\pmb{x}_{\delta}(t)=\pmb{x}_{\delta}(t_j)+(t-t_j)\pmb{f}(t_j,\pmb{x}_{\delta}(t_j)),\quad t_j\leq t\leq t_{j+1}.\tag2$$ The function ##\pmb{x}_{\delta}## is defined correspondingly in the interval ##[t_0-a,t_0]##.
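For what it's worth, here is a minimal Python sketch of the recursion ##(1)##-##(2)## (not from the book; the function name and the scalar test problem are just illustrative):

```python
import numpy as np

def euler_polygon(f, t0, x0, a, m):
    """Euler polygon x_delta on [t0, t0 + a] with m equal steps.

    Implements the recursion (1); between the step points the curve is
    the straight line (2).  Returns the step points t_j and x_delta(t_j).
    """
    delta = a / m
    t = t0 + delta * np.arange(m + 1)
    x = np.empty((m + 1,) + np.shape(x0))
    x[0] = x0
    for j in range(m):
        x[j + 1] = x[j] + delta * np.asarray(f(t[j], x[j]))
    return t, x

# Example: x' = x, x(0) = 1, whose exact solution is e^t.
t, x = euler_polygon(lambda t, x: x, t0=0.0, x0=1.0, a=1.0, m=10)
print(x[-1], np.e)  # Euler approximation of e vs. the exact value
```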

The following theorem shows ##\pmb{x}_{\delta}## is an ##\varepsilon##-approximate solution to ##\pmb{x}'=\pmb{f}(t,\pmb{x})## under certain conditions on ##\pmb{f}##.
Theorem 2. [Let ##B,L## and ##C## be positive constants.] Assume that
  1. the function ##\pmb{f}(t,\pmb{x})## is continuous in ##\Omega\subseteq\mathbf{R}\times\mathbf{R}^n##, that ##|\pmb{f}(t,\pmb{x})|\leq B## in ##\Omega##, and that ##a## is so small that ##\Lambda(a)\subseteq\Omega##, where ##\Lambda(a)## denotes the double cone $$\Lambda(a)=\{(t,\pmb{x})\in \mathbf{R}\times\mathbf{R}^n;|t-t_0|\leq a, \ |\pmb{x}-\pmb{x}_0|\leq B|t-t_0|\},$$
  2. ##|\pmb{f}(t,\pmb{x})-\pmb{f}(t,\pmb{y})|\leq L|\pmb{x}-\pmb{y}| \qquad \ \ \ (t,\pmb{x}),(t,\pmb{y})\in\Omega##,
  3. ##|\pmb{f}(t',\pmb{x})-\pmb{f}(t'',\pmb{x})|\leq C|t'-t''| \quad (t',\pmb{x}),(t'',\pmb{x})\in\Omega##.
Then ##\pmb{x}_{\delta}## is an ##\varepsilon##-approximate solution to the system ##\pmb{x}'=\pmb{f}(t,\pmb{x})## in the interval ##[t_0-a,t_0+a]##, with ##\varepsilon=\delta(C+LB)##.

Proof. All line segments in the definition of ##\pmb{x}_{\delta}## have slope at most ##B##. Therefore ##(t,\pmb{x}_{\delta}(t))\in\Lambda(a)## when ##t\in[t_0-a,t_0+a]##. Moreover, if ##|t'-t''|\le\delta## then \begin{align} |\pmb{f}(t',\pmb{x}_{\delta}(t'))-\pmb{f}(t'',\pmb{x}_{\delta}(t''))|&\leq |\pmb{f}(t',\pmb{x}_{\delta}(t'))-\pmb{f}(t'',\pmb{x}_{\delta}(t'))| \nonumber \\ &+|\pmb{f}(t'',\pmb{x}_{\delta}(t'))-\pmb{f}(t'',\pmb{x}_{\delta}(t''))| \nonumber \\ &\leq C|t'-t''|+L|\pmb{x}_{\delta}(t')-\pmb{x}_{\delta}(t'')| \nonumber \\ &\leq C|t'-t''|+LB|t'-t''| \nonumber \\ &\leq \delta(C+LB).\tag3 \end{align}
[The first inequality is the triangle inequality, the second follows from 2. and 3. in the assumptions, and the third from the fact that each line segment of ##\pmb{x}_{\delta}## has slope at most ##B##, so ##|\pmb{x}_{\delta}(t')-\pmb{x}_{\delta}(t'')|\leq B|t'-t''|##.] We must prove that $$\left|\pmb{x}_{\delta}(t'')-\pmb{x}_{\delta}(t')-\int_{t'}^{t''} \pmb{f}(s,\pmb{x}_{\delta}(s))ds\right|\leq \delta(C+LB)|t'-t''|.\tag4$$
If ##t'## and ##t''## belong to the same subinterval ##[t_j,t_{j+1}]## then [from ##(2)##] $$\pmb{x}_{\delta}(t'')-\pmb{x}_{\delta}(t')=(t''-t')\pmb{f}(t_j,\pmb{x}_{\delta}(t_j))=\int_{t'}^{t''} \pmb{f}(t_j,\pmb{x}_{\delta}(t_j))ds,$$ and ##(4)## follows from ##(3)##. (In particular, ##(4)## is valid when ##t'=t_j## and ##t''=t_{j+1}##). If ##t'## and ##t''## belong to different subintervals then use ##(4)## on each one of the intervals ##[t',t_{j+1}],[t_{j+1},t_{j+2}],\ldots,[t_{k-1},t_k],[t_k,t'']## and add the results.
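As a sanity check (purely illustrative, not from the book), the bound ##(4)## can be verified numerically for ##f(t,x)=x##: on the cone ##\Lambda(1/2)## around ##(t_0,\pmb{x}_0)=(0,1)## one may take ##B=2##, ##L=1##, ##C=0##, so ##\varepsilon=2\delta##. The helper `euler_polygon` is the sketch given after ##(2)## above.

```python
import numpy as np

def defect(f, t, x, i, k, refine=200):
    """|x_delta(t_k) - x_delta(t_i) - integral_{t_i}^{t_k} f(s, x_delta(s)) ds|."""
    total = 0.0
    for j in range(i, k):
        # On [t_j, t_{j+1}] the polygon is the straight line (2); integrate
        # f along it with the trapezoidal rule on a fine subgrid.
        s = np.linspace(t[j], t[j + 1], refine)
        xs = x[j] + (s - t[j]) * f(t[j], x[j])
        total += np.trapz(f(s, xs), s)
    return abs(x[k] - x[i] - total)

f = lambda t, x: x           # L = 1 (Lipschitz in x), C = 0 (no t-dependence)
a, m = 0.5, 50               # delta = 0.01, and B = 2 bounds |f| on Lambda(1/2)
delta = a / m
t, x = euler_polygon(f, t0=0.0, x0=1.0, a=a, m=m)
eps = delta * (0 + 1 * 2)    # eps = delta*(C + L*B) from Theorem 2
print(defect(f, t, x, 0, m), "<=", eps * (t[m] - t[0]))
```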
Now comes the part I have some questions about. The authors claim that if we drop assumptions 2. and 3. in theorem 2, it is still possible to show that there is a number ##\varepsilon(\delta)##, tending to zero as ##\delta\to 0##, such that ##\pmb{x}_{\delta}## is an ##\varepsilon(\delta)##-approximate solution in the interval ##I(a)=[t_0-a,t_0+a]##. They write:
We know that ##(t,\pmb{x}_{\delta}(t))\in\Lambda(a)## when ##t\in I(a)## and that $$|\pmb{x}_{\delta}(t')-\pmb{x}_{\delta}(t'')|\leq B|t'-t''|\quad t',t''\in I(a).\tag5$$ Put $$\varepsilon(\delta)=\sup\limits_{|t'-t''|\leq\delta}|\pmb{f}(t',\pmb{x}_{\delta}(t'))-\pmb{f}(t'',\pmb{x}_{\delta}(t''))|.$$ The computations in the proof of theorem 2 show that ##\pmb{x}_{\delta}## is an ##\varepsilon(\delta)##-approximate solution. Furthermore, ##\pmb{f}## is uniformly continuous on ##\Lambda(a)##. From ##(5)## it accordingly follows that $$\lim_{\delta\to0}\varepsilon(\delta)=0.\tag6$$
Question 1: I simply do not understand how ##(6)## follows from ##(5)##. How does it?

Question 2: The authors go on to claim that when the IVP ##\pmb{x}'=\pmb{f}(t,\pmb{x}), \pmb{x}(t_0)=\pmb{x}_0## has a solution ##\pmb{x}(t)##, then it follows from ##(6)## and theorem 1, that ##\pmb{x}_{\delta}## converges uniformly to ##\pmb{x}## as ##\delta\to 0##. The definition of uniform convergence I'm used to is ##\lVert f_n-f\rVert\to 0## as ##n\to\infty##, where ##\lVert\cdot\rVert## denotes the sup-norm, but here they are claiming that the sup-norm should tend to ##0## as ##\delta\to 0##. This makes me wonder; what is ##\pmb{x}_{\delta}##? Is it a sequence? If not, what definition of uniform convergence are the authors using?

EDIT: The last quoted passage is just prior to presenting Peano's existence theorem. They note that:
If you only assume that ##\pmb f## is continuous, you cannot use [Picard-Lindelöf's theorem]. In fact, you cannot even be certain that ##\pmb{x}_{\delta}(t)## converges when ##\delta\to 0##. But what you can prove is that there is a sequence ##\delta_p##, tending to zero as ##p\to\infty##, such that the functions ##\pmb {x}_{\delta_p}(t)## converge uniformly on ##I(a)##.
 
  • #2
Question 2: The authors go on to claim that when the IVP ##\pmb{x}'=\pmb{f}(t,\pmb{x}), \pmb{x}(t_0)=\pmb{x}_0## has a solution ##\pmb{x}(t)##, then it follows from ##(6)## and theorem 1, that ##\pmb{x}_{\delta}## converges uniformly to ##\pmb{x}## as ##\delta\to 0##. The definition of uniform convergence I'm used to is ##\lVert f_n-f\rVert\to 0## as ##n\to\infty##, where ##\lVert\cdot\rVert## denotes the sup-norm, but here they are claiming that the sup-norm should tend to ##0## as ##\delta\to 0##. This makes me wonder; what is ##\pmb{x}_{\delta}##? Is it a sequence? If not, what definition of uniform convergence are the authors using?

Let [itex]X[/itex] be a compact space and [itex]f_\alpha : X \to \mathbb{R}^n[/itex] a family of functions defined for [itex]\alpha \in U \subset \mathbb{R}[/itex]. Then [itex]f_\alpha \to f[/itex] uniformly on [itex]X[/itex] as [itex]\alpha \to \alpha_0 \in \bar{U}[/itex] if and only if for every [itex]\epsilon > 0[/itex] there exists [itex]\delta > 0[/itex] such that for all [itex]\alpha \in U[/itex], [tex]|\alpha - \alpha_0| < \delta \quad \Rightarrow\quad
\sup_{x \in X} \|f_\alpha(x) - f(x)\| < \epsilon.[/tex] Note that this has the same relation to the definition of uniform convergence of a sequence of functions as the definition of the limit of a function at a point has to the definition of the limit of a sequence.
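To see this parametrized convergence in action for the Euler polygons, here is a small illustrative Python check (the ODE ##x'=x##, ##x(0)=1## and all names are just an example): the sup-norm distance over the step points between ##\pmb{x}_{\delta}## and the exact solution shrinks as ##\delta\to 0##.

```python
import numpy as np

# Illustrative only: for x' = x, x(0) = 1 on [0, 1], compute the sup-norm
# distance (over the step points) between the Euler polygon and exp(t)
# for shrinking step sizes delta = 1/m.
f, t0, x0, a = (lambda t, x: x), 0.0, 1.0, 1.0
for m in (10, 100, 1000):
    delta = a / m
    t = t0 + delta * np.arange(m + 1)
    x = np.empty(m + 1)
    x[0] = x0
    for j in range(m):
        x[j + 1] = x[j] + delta * f(t[j], x[j])
    print(delta, np.max(np.abs(x - np.exp(t))))  # should decrease with delta
```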
 
