Finding a periodic solution to an ODE

In summary, this thread looks for a periodic solution of ##y''+y'+y=g##, where ##g## is a ##4\pi##-periodic square wave, by expanding both ##g## and the candidate solution in complex Fourier series. The discussion centers on how to verify that the resulting series actually defines a ##C^1## function on all of ##\mathbb R##, which is eventually settled by a decay estimate on the coefficients (Theorem 9.4 of Körner's Fourier Analysis).
  • #1
psie
Homework Statement
Find a periodic solution with a continuous first derivative on ##\mathbb R## of the differential equation ##y''+y'+y=g##, where ##g## has period ##4\pi## and ##g(t)=1## for ##|t|<\pi##, ##g(t)=0## for ##\pi<|t|<2\pi##.
Relevant Equations
Complex Fourier series, complex Fourier coefficients, etc.
My main concern with this exercise is that I do not know how to verify that the solution is ##C^1## on all of ##\mathbb R##; ##g## is certainly discontinuous. I begin by computing its Fourier coefficients. They are $$c_n=\frac{1}{4\pi}\int_{-2\pi}^{2\pi}g(t)e^{-int/2}dt= \frac{1}{4\pi}\int_{-\pi}^{\pi}e^{-int/2}dt.$$ So ##c_0=\frac12## and, using WolframAlpha this time, ##c_n=\frac{\sin\left(\frac{\pi n}{2}\right)}{\pi n}## for ##n\neq 0##. If we assume ##y(t)## can be represented by a Fourier series with coefficients ##b_n##, the ODE reads $$\sum \left(-\frac{n^2}{4}+ \frac{in}{2}+1\right)b_n e^{int/2} =\sum c_n e^{int/2}.$$ From this we obtain $$b_n=\frac{c_n}{1+\frac{in}{2}-\frac{n^2}{4}}=\frac{\sin\left(\frac{\pi n}{2}\right)}{\pi n\left(1+\frac{in}{2}-\frac{n^2}{4}\right)}.$$ But why would this function be ##C^1## on all of ##\mathbb R##?
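For what it's worth, here is a quick numerical sketch of the partial sums built from these coefficients (Python with numpy; the truncation level ##N## and the sample grid are arbitrary choices, and this is a sanity check rather than a proof). The partial sums of ##y## and ##y'## show no visible jumps at ##t=\pm\pi##:

[code=python]
import numpy as np

# Fourier coefficients of g (period 4*pi), as computed above:
# c_0 = 1/2, c_n = sin(pi*n/2)/(pi*n) for n != 0.
def c(n):
    return 0.5 if n == 0 else np.sin(np.pi * n / 2) / (np.pi * n)

# Coefficients of the candidate solution: b_n = c_n / (1 + i*n/2 - n^2/4).
def b(n):
    return c(n) / (1 + 0.5j * n - 0.25 * n ** 2)

N = 200                                    # truncation level (arbitrary)
ns = np.arange(-N, N + 1)
t = np.linspace(-2 * np.pi, 2 * np.pi, 2001)

bs = np.array([b(n) for n in ns])
E = np.exp(0.5j * np.outer(ns, t))         # rows are e^{i n t / 2}

y = (bs[:, None] * E).sum(axis=0)                  # partial sum of y
yp = ((0.5j * ns * bs)[:, None] * E).sum(axis=0)   # partial sum of y'

# Since b_{-n} = conj(b_n), y should be real up to rounding error.
print("max |Im y|:", np.abs(y.imag).max())

# Difference of y' across t = pi: stays small as N grows (no visible jump).
i = np.searchsorted(t, np.pi)
print("y' across t = pi:", abs(yp.real[i + 1] - yp.real[i - 1]))
[/code]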
 
  • #2
psie said:
Homework Statement: Find a periodic solution with a continuous first derivative on ##\mathbb R## of the differential equation ##y''+y'+y=g##, where ##g## has period ##4\pi## and ##g(t)=1## for ##|t|<\pi##, ##g(t)=0## for ##\pi<|t|<2\pi##.
Relevant Equations: Complex Fourier series, complex Fourier coefficients, etc.

But why would this function be ##C^1## on all of ##\mathbb R##?
Isn't it more or less trivial work to prove that the function $$y(t)=\sum b_n e^{int/2}$$ has a continuous first derivative, since it is the (countably infinite) sum of the continuously differentiable functions $$y_n(t)=b_n e^{int/2}$$?
 
  • #3
P.S. We could have said the same thing for ##g(t)## (that it is continuously differentiable), which it certainly is not.

Hmm, it must have something to do with the fact that ##b_n## converges to zero much faster than ##c_n##.
 
  • #4
Delta2 said:
Isn't it more or less trivial work to prove that the function $$y(t)=\sum b_n e^{int/2}$$ has a continuous first derivative, since it is the (countably infinite) sum of the continuously differentiable functions $$y_n(t)=b_n e^{int/2}$$?
Hmm, maybe something along these lines, but for your statement to be true, we'd probably need uniform convergence of the differentiated series, which I think is not so trivial to verify.
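For example, the sawtooth series ##\sum_{n\ge 1}\frac{\sin nt}{n}## is a sum of smooth terms whose limit is not even continuous, so termwise smoothness alone proves nothing. A small sketch (Python with numpy; the truncation levels are arbitrary):

[code=python]
import numpy as np

# Partial sums of sum_{n=1}^N sin(n t)/n. Every term is C-infinity, but the
# limit is the sawtooth (pi - t)/2 on (0, 2*pi), which jumps at t = 0.
def partial_sum(t, N):
    n = np.arange(1, N + 1)
    return np.sum(np.sin(np.outer(n, t)) / n[:, None], axis=0)

t = np.array([-0.01, 0.01])                # straddle the jump at t = 0
for N in (10, 100, 1000, 10000):
    left, right = partial_sum(t, N)
    print(N, right - left)                 # tends to pi, not 0: a jump forms
[/code]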

Regarding decay of coefficients, it holds that ##c_n## decays as ##O(n^{-k})## if ##f\in C^k##, but I believe the converse is not true. One can construct an easy counterexample with an indicator function.
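Concretely (a Python sketch; I take the ##2\pi##-periodic indicator of ##(0,\pi)## as the example): its coefficients decay like ##O(n^{-1})##, which a naive converse would read as ##C^1##-smoothness, yet the function is not even continuous.

[code=python]
import numpy as np

# 2*pi-periodic indicator of (0, pi): c_n = (1 - e^{-i*pi*n}) / (2*pi*i*n)
# for n != 0, i.e. |c_n| = 1/(pi*|n|) for odd n and 0 for even n. So the
# coefficients are O(1/n), although the indicator is not even continuous.
def c(n):
    return (1 - np.exp(-1j * np.pi * n)) / (2j * np.pi * n)

for n in (1, 11, 101, 1001):               # odd n, where c_n != 0
    print(n, abs(c(n)) * n)                # roughly 1/pi: |c_n| = O(1/n)
[/code]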

Maybe I should just see it as an assumption on ##y## for it to have a Fourier series, because if it’s ##C^1##, its Fourier series converges. There is no reason why ##y## should inherit any properties of ##g##.
 
  • #5
psie said:
Regarding decay of coefficients, it holds that ##c_n## decays as ##O(n^{-k})## if ##f\in C^k##, but I believe the converse is not true. One can construct an easy counterexample with an indicator function.
Hmmm, maybe there is some extra condition so that the converse is also true?
 
  • #6
Oh damn, it's very easy and requires only Calculus 1 to show, lol. Since we are given that the second derivative of ##y## exists, the first derivative is differentiable and hence continuous.
 
  • #7
Delta2 said:
Oh damn, it's very easy and requires only Calculus 1 to show, lol. Since we are given that the second derivative of ##y## exists, the first derivative is differentiable and hence continuous.
Yeah, maybe this does it. But I’m thinking the ODE only says what happens at the points of continuity of ##g##. It doesn’t say what happens at ##\pi## for instance. We could have that ##y## is also discontinuous at ##\pi##, which makes it not ##C^1## on all of ##\mathbb R##.
 
  • #8
psie said:
Yeah, maybe this does it. But I’m thinking the ODE only says what happens at the points of continuity of ##g##. It doesn’t say what happens at ##\pi## for instance. We could have that ##y## is also discontinuous at ##\pi##, which makes it not ##C^1## on all of ##\mathbb R##.
Yes, I thought of that complication about the discontinuity of ##y## at specific points too. Given that you have found the correct formulas for ##c_n## and ##b_n## (something doesn't look quite right to me, but let's leave that for the moment), can we prove that $$y(t)=\sum b_n e^{int/2}$$ is twice differentiable everywhere on ##\mathbb R##, i.e. for every ##t\in\mathbb{R}##?

Seems to me we'll have to check whether the series $$y''(t)=-\sum \frac{n^2}{4}b_n e^{int/2}$$ converges for every ##t\in\mathbb R##.
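A quick numerical look (a Python sketch reusing the ##c_n## and ##b_n## from post #1; the sample values of ##n## are arbitrary) suggests where the trouble lies: the terms of the differentiated series decay like ##n^{-2}##, so the ##y'## series converges absolutely, while the terms of the ##y''## series only decay like ##n^{-1}##, no faster than the ##c_n## of the discontinuous ##g##:

[code=python]
import numpy as np

def c(n):
    return np.sin(np.pi * n / 2) / (np.pi * n)

def b(n):
    return c(n) / (1 + 0.5j * n - 0.25 * n ** 2)

for n in (11, 101, 1001, 10001):            # odd n, where c_n != 0
    print(n,
          abs(0.5j * n * b(n)) * n ** 2,    # ~ constant: y'-terms are O(1/n^2)
          abs(0.25 * n ** 2 * b(n)) * n)    # ~ constant: y''-terms are O(1/n)
[/code]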
 
  • #9
You know from the general theory of linear ODEs that the solution is a linear combination of [itex]e^{\omega t}[/itex] and [itex]e^{\bar{\omega}t}[/itex] in [itex]\pi < |t| < 2\pi[/itex], and 1 plus a linear combination of those in [itex]|t| < \pi[/itex], where [itex]\omega^2 + \omega + 1 = 0[/itex] so that [itex]\omega = e^{2\pi i/3}[/itex]. The coefficients of [itex]e^{\omega t}[/itex] and [itex]e^{\bar{\omega}t}[/itex] in the three regions [itex]-2\pi < t < -\pi[/itex], [itex]-\pi < t < \pi[/itex] and [itex]\pi < t < 2\pi[/itex] are fixed by the periodicity and continuity requirements on [itex]y[/itex] and its derivative - six conditions in all.

You could, in principle, find those coefficients, compute the Fourier series of that solution and show that they are identical to those found by transforming the equation, but I would hope that there is an easier way.
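For what it's worth, here is a numerical sketch of that matching (Python with numpy; it treats one period as the two intervals [itex]|t| < \pi[/itex] and [itex]\pi < t < 3\pi[/itex], so four unknown coefficients and four conditions, which is equivalent to the six-condition formulation above):

[code=python]
import numpy as np

w = np.exp(2j * np.pi / 3)                  # root of w^2 + w + 1 = 0
wb = np.conj(w)
e, pi = np.exp, np.pi

# y = 1 + a1*e^{w t} + a2*e^{wb t}  on |t| < pi      (where g = 1)
# y =     a3*e^{w t} + a4*e^{wb t}  on pi < t < 3*pi (where g = 0)
# Match y and y' at t = pi, and at t = 3*pi against t = -pi (periodicity).
A = np.array([
    [e(w*pi),      e(wb*pi),      -e(w*pi),       -e(wb*pi)],
    [w*e(w*pi),    wb*e(wb*pi),   -w*e(w*pi),     -wb*e(wb*pi)],
    [-e(-w*pi),    -e(-wb*pi),    e(3*w*pi),      e(3*wb*pi)],
    [-w*e(-w*pi),  -wb*e(-wb*pi), w*e(3*w*pi),    wb*e(3*wb*pi)],
])
rhs = np.array([-1, 0, 1, 0])               # the particular solution 1 moved over
a1, a2, a3, a4 = np.linalg.solve(A, rhs)

# Sanity check: y agrees from both sides at t = pi (y' likewise by row 2).
print(1 + a1*e(w*pi) + a2*e(wb*pi) - (a3*e(w*pi) + a4*e(wb*pi)))
[/code]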

Note that [tex]1 + \frac{in}2 - \frac{n^2}4 = \left(\omega - \frac{in}2\right)\left(\bar{\omega} - \frac{in}{2}\right).[/tex]

EDIT: The condition you are looking for (which is Theorem 9.4 of Körner's Fourier Analysis) is that if [tex]
\sum_{n=-\infty}^\infty |n||c_n|[/tex] converges then [itex]\sum_{n=-\infty}^\infty c_ne^{int}[/itex] is once continuously differentiable, and [itex]\sum_{n=-N}^N inc_ne^{int}[/itex] converges uniformly to its derivative. That is the case here, since [itex]|nb_n|[/itex] decreases as [itex]|n|^{-2}[/itex] for large [itex]|n|[/itex].
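A quick check of that decay (a Python sketch with the ##b_n## from post #1; the truncation levels are arbitrary): the partial sums of [itex]\sum |n||b_n|[/itex] plateau, consistent with [itex]|nb_n| = O(|n|^{-2})[/itex].

[code=python]
import numpy as np

def c(n):
    return np.sin(np.pi * n / 2) / (np.pi * n)

def b(n):
    return c(n) / (1 + 0.5j * n - 0.25 * n ** 2)

# Partial sums of sum |n||b_n| over 0 < |n| <= N. Since b_{-n} = conj(b_n),
# it is enough to sum over n >= 1 and double. The sums plateau quickly.
for N in (10, 100, 1000, 10000):
    S = 2 * sum(n * abs(b(n)) for n in range(1, N + 1))
    print(N, S)
[/code]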
 
  • #10
Delta2 said:
Isn't it more or less trivial work to prove that the function $$y(t)=\sum b_n e^{int/2}$$ has a continuous first derivative, since it is the (countably infinite) sum of the continuously differentiable functions $$y_n(t)=b_n e^{int/2}$$?
Still, given an infinite sum, there may be some kinks to figure out as to what happens in the limit.
 
  • #11
pasmith said:
EDIT: The condition you are looking for (which is Theorem 9.4 of Körner's Fourier Analysis) is that if [tex]
\sum_{n=-\infty}^\infty |n||c_n|[/tex] converges then [itex]\sum_{n=-\infty}^\infty c_ne^{int}[/itex] is once continuously differentiable, and [itex]\sum_{n=-N}^N inc_ne^{int}[/itex] converges uniformly to its derivative. That is the case here, since [itex]|nb_n|[/itex] decreases as [itex]|n|^{-2}[/itex] for large [itex]|n|[/itex].
I've been looking into Körner's book. The theorem reads:
Theorem 9.4 Let ##f:\mathbb T\to\mathbb C## be continuous. Then if ##\sum_{n=-\infty}^\infty |n||c_n|## converges, it follows that ##f## is once continuously differentiable and that ##\sum_{n=-N}^N in c_ne^{int}\to f'(t)## uniformly as ##N\to\infty##.
I think the continuity requirement on ##f## is crucial here. In the exercise above, we don't know if ##y## is continuous or piecewise continuous. In case of the latter, I don't think this theorem applies.
 
  • #12
psie said:
we don't know if y is continuous or piecewise continuous.
You've got to give us the full word-for-word statement of the problem, but as already given it implies that the equation ##y''+y'+y=g## holds everywhere; hence the first and second derivatives of ##y## exist everywhere on ##\mathbb R##, and hence ##y## and its first derivative are continuous.
 
  • #13
The statement given in the book is:
Find, in the guise of a "complex" Fourier series, a periodic solution with a continuous first derivative on ##\mathbb R## of the differential equation ##y''+y'+y=g##, where ##g## has period ##4\pi## and ##g(t)=1## for ##|t|<\pi##, ##g(t)=0## for ##\pi<|t|<2\pi##.
Delta2 said:
the equation y''+y'+y=g holds everywhere
The equation holds everywhere ##g## is defined, which is not all of ##\mathbb R##. Hence one can only speculate whether ##y## is continuous or merely piecewise continuous. In case it is piecewise continuous, I don't believe @pasmith's suggestion about Theorem 9.4 applies.
 
  • #14
psie said:
The statement given in the book is: The equation holds everywhere ##g## is defined, which is not all of ##\mathbb R##. Hence one can only speculate whether ##y## is continuous or merely piecewise continuous. In case it is piecewise continuous, I don't believe @pasmith's suggestion about Theorem 9.4 applies.
I think we're sort of getting drowned in a spoonful of water. Anyway, in my opinion the theorem given by @pasmith doesn't presuppose a function ##f## whose Fourier coefficients are ##c_n##. It just presupposes that ##c_n## is a sequence of complex numbers, and if the series ##\sum |n||c_n|## converges, then the Fourier series with the ##c_n## as coefficients is once continuously differentiable. (Indeed, if ##\sum |c_n|## converges, the series converges uniformly to a continuous function whose Fourier coefficients are the ##c_n##, so the continuity hypothesis is satisfied automatically.)
 
  • #15
Delta2 said:
You've got to give us the full word-for-word statement of the problem, but as already given it implies that the equation ##y''+y'+y=g## holds everywhere; hence the first and second derivatives of ##y## exist everywhere on ##\mathbb R##, and hence ##y## and its first derivative are continuous.
Still, I don't know if they're considered here, but you also have weak/distributional solutions too: https://en.m.wikipedia.org/wiki/Weak_solution
 
  • #16
psie said:
The statement given in the book is: The equation holds everywhere ##g## is defined, which is not all of ##\mathbb R##. Hence one can only speculate whether ##y## is continuous or merely piecewise continuous. In case it is piecewise continuous, I don't believe @pasmith's suggestion about Theorem 9.4 applies.

[itex]C^1[/itex] implies continuous; a function cannot be differentiable, let alone continuously so, unless it is continuous.
 
  • #17
I'm asking myself: as far as continuously differentiable goes, is the response of a second-order system to a square wave fundamentally different from the response to a single step function? (For the latter there are explicit solutions that can be checked more easily.)

 

