Can the Origin be a Critical Point for a Function in an Interval?

  • Thread starter 0kelvin
In summary, differentiating ##y(x)=\int_0^x \sin(x-t)f(t)\,dt## with the Leibniz integral rule gives ##y'(x)=\int_0^x \cos(x-t)f(t)\,dt## and ##y''(x)=f(x)-\int_0^x \sin(x-t)f(t)\,dt = f(x)-y(x)##, so that ##y''+y=f(x)## and ##y(0)=y'(0)=0##.
  • #1
0kelvin
Homework Statement
Let f be a continuous function on an interval I containing the origin and let

##y = y(x) = \int_0^x \sin(x - t)f(t)\, dt##

Prove that ##y'' + y = f(x)## for all ##x \in I##, and that ##y(0) = y'(0) = 0##.
Relevant Equations
...
I know how to solve ##\frac{d}{dx} \int_0^{x^2} \sin(t^2)\, dt## and from the statement I got that f(0) = 0 because f contains the origin and is continuous.

I tried ##y'(x) = \sin(x - x)f(x) - \sin(x - 0)f(0)##, but that doesn't seem right.
 
  • #2
You are not applying Leibniz's integral rule correctly.
https://en.wikipedia.org/wiki/Leibniz_integral_rule

Also, I don't understand how you got that f(0) = 0 because f is continuous and contains the origin.

Just apply Leibniz's integral rule correctly to calculate y'(x) and y''(x).
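For reference, the linked rule states (for differentiable boundaries ##a(x)##, ##b(x)## and an integrand whose partial derivative in ##x## is continuous):

$$\frac{d}{dx}\int_{a(x)}^{b(x)} g(x,t)\,dt = g(x,b(x))\,b'(x) - g(x,a(x))\,a'(x) + \int_{a(x)}^{b(x)} \frac{\partial}{\partial x} g(x,t)\,dt$$

Here ##a(x)=0## and ##b(x)=x##, and only ##\partial g/\partial x = \cos(x-t)f(t)## enters, so continuity of ##f## suffices.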
 
  • #3
We have a function ##g(x,t)=\sin(x-t)f(t)## as integrand, so that ##y(x)=\int_0^x g(x,t)\,dt##.
I would like to differentiate by ##\dfrac{d}{dx}y(x)= \displaystyle{ \int_0^x } \dfrac{\partial }{\partial x}g(x,t)\,dt## but I'm not 100% sure whether this is ok if the variable is still in the boundary.

What does your book say about differentiating convolutions?
Have a look: https://en.wikipedia.org/wiki/Convolution#Differentiation
 
  • #4
fresh_42 said:
I'm not 100% sure whether this is ok if the variable is still in the boundary.
It is not. The easy counterexample is to let g(x,t) be a nonzero constant.
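Spelling the counterexample out: with ##g(x,t) = c \neq 0##,

$$\frac{d}{dx}\int_0^x c\,dt = \frac{d}{dx}(cx) = c, \qquad \text{while} \qquad \int_0^x \frac{\partial}{\partial x}\,c\;dt = 0,$$

so what is missing is exactly the boundary term ##g(x,x)\cdot\frac{dx}{dx} = c##.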
 
  • #5
fresh_42 said:
We have a function ##g(x,t)=\sin(x-t)f(t)## as integrand, so that ##y(x)=\int_0^x g(x,t)\,dt##.
I would like to differentiate by ##\dfrac{d}{dx}y(x)= \displaystyle{ \int_0^x } \dfrac{\partial }{\partial x}g(x,t)\,dt## but I'm not 100% sure whether this is ok if the variable is still in the boundary.

What does your book say about differentiating convolutions?
Have a look: https://en.wikipedia.org/wiki/Convolution#Differentiation
I am not so sure that the differentiation result for convolution can be applied when the boundaries of the convolution are functions of x.
 
  • #6
Delta2 said:
I am not so sure that the differentiation result for convolution can be applied when the boundaries of the convolution are functions of x.
It cannot.
 
  • #7
Here is an idea, derived from the principle: eliminate what disturbs!
I'm not sure whether my first step is necessary, but it is what I did. We have
\begin{align*}
y(x)&=\int_0^x \sin(x-t)f(t)\,dt\\
&=-\int_0^x \sin(t-x)f(t)\,dt\\
&\stackrel{(sx=t-x)}{=} -\int_{-1}^0 \sin(sx)\left( xf(sx+x) \right)\,ds \\
&\stackrel{(g(sx)=xf(sx+x))}{=} -\int_{-1}^0 \sin(sx) g(sx) \, ds
\end{align*}
where now ##x## is just a constant and the integral is basically the same as before, with constant boundaries and a modified function ##g(sx):=xf(sx+x)## which is now defined on ##[-1,0]## and continuous.
 
  • #8
You can of course do something like that, but why would you? It is a simple matter of correctly applying what was linked to already in #2.
 
  • #9
Yes, well, the Leibniz rule correctly applied gives the answer in three lines, but what @fresh_42 does is interesting because it removes the dependence of the boundary on x.
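For completeness, the three lines presumably run:

\begin{align*}
y'(x) &= \sin(x-x)f(x) + \int_0^x \cos(x-t)f(t)\,dt = \int_0^x \cos(x-t)f(t)\,dt,\\
y''(x) &= \cos(x-x)f(x) - \int_0^x \sin(x-t)f(t)\,dt = f(x) - y(x),
\end{align*}

and ##y(0) = y'(0) = 0## since both integrals vanish at ##x = 0##.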
 
  • #10
I'm starting to like the question, because one can learn how to deal with parameter integrals as well as this useful principle of attacking what disturbs. I assume that the proof of the Leibniz rule does exactly this. But one has to be cautious with Leibniz, as ##f(t)## is not necessarily differentiable.
 
  • #11
While that would be workable, I think it is more common to just apply the definition of the derivative.
 
  • #12
fresh_42 said:
I'm starting to like the question, because one can learn how to deal with parameter integrals as well as this useful principle of attacking what disturbs. I assume that the proof of the Leibniz rule does exactly this. But one has to be cautious with Leibniz, as ##f(t)## is not necessarily differentiable.
I don't think we need differentiability with respect to t; the statement of the theorem speaks about continuity in x and t, and differentiation with respect to x.
 
  • #13
I haven't checked; that's why I said to be cautious. Anyway, there is something to learn from the exercise.
 
  • #14
I just realized that I was confusing differentiation with integration.

$$\int_0^x \sin(t)\, dt = -\cos(x) + 1$$

but

$$\frac{d}{dx} \int_0^x \sin(t)\, dt = \sin(x)$$

not ##F'(x) - F'(0) = \sin(x) - \sin(0)##
 
  • #15
Following Wikipedia's article I was able to prove it. I also just realized something else: the statement says that the interval contains the origin, not that the function itself passes through the point (0,0).
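As a sanity check (not part of the proof), one can verify the identity numerically for a sample continuous function. Here ##f(t)=e^t## is a hypothetical choice, picked because the initial value problem ##y''+y=e^x##, ##y(0)=y'(0)=0## has the closed-form solution ##y=(e^x-\cos x-\sin x)/2##. A minimal Python sketch:

```python
import math

def f(t):
    # Hypothetical test function; any continuous f works.
    return math.exp(t)

def y(x, n=2000):
    """y(x) = integral_0^x sin(x - t) f(t) dt, via composite Simpson's rule."""
    if x == 0.0:
        return 0.0
    h = x / n  # n must be even for Simpson's rule
    total = math.sin(x) * f(0.0) + math.sin(0.0) * f(x)  # endpoint terms
    for k in range(1, n):
        total += (4 if k % 2 else 2) * math.sin(x - k * h) * f(k * h)
    return total * h / 3

def exact(x):
    # Closed-form solution of y'' + y = e^x with y(0) = y'(0) = 0.
    return (math.exp(x) - math.cos(x) - math.sin(x)) / 2

x0, h = 1.0, 1e-2
ypp = (y(x0 + h) - 2 * y(x0) + y(x0 - h)) / h**2  # central second difference
print(abs(y(x0) - exact(x0)))    # residual vs. closed form: should be tiny
print(abs(ypp + y(x0) - f(x0)))  # y'' + y - f: small, limited by the step h
```

Both residuals should come out small; the second is limited by the finite-difference step rather than by the identity itself.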
 

FAQ: Can the Origin be a Critical Point for a Function in an Interval?

What is the meaning of the equation "y'' + y = f(x)"?

The equation "y'' + y = f(x)" is a second-order linear nonhomogeneous differential equation, where y is the dependent variable, x is the independent variable, and f(x) is a given function of x. (When f(x) = 0, the equation is homogeneous.)

What does it mean to "prove" this equation?

To prove the equation "y'' + y = f(x)" here means to show that the specific function y(x) defined by the integral satisfies it: compute y' and y'' from the integral definition and verify that y'' + y equals f(x), together with the initial conditions y(0) = y'(0) = 0.

Why is this equation important in science?

This equation is important in science because it is used to model many physical phenomena, such as oscillations, vibrations, and electrical circuits. It also has applications in engineering, economics, and biology.

How do you solve this equation?

To solve the equation "y'' + y = f(x)", we can use methods such as variation of parameters, undetermined coefficients, or the Laplace transform. The specific method used will depend on the given function f(x) and any initial or boundary conditions.

What are some real-world examples of this equation?

Some real-world examples of the equation "y'' + y = f(x)" include the motion of a mass on a spring, the movement of a pendulum, and the behavior of an electrical circuit with an inductor and capacitor (an LC circuit) driven by a source.
