How can we prove the derivative inequality for f(x)=sin(x)/x?

In summary, Opalg solved the challenge one way and Fallen Angel followed up with a slightly different solution. Both solutions are very neat!
  • #1
Fallen Angel
Hi,

My first challenge was not very popular so I bring you another one.

Let us define \(\displaystyle f(x)=\dfrac{\sin(x)}{x}\) for \(\displaystyle x>0\).

Prove that for every \(\displaystyle n\in \mathbb{N}\) and every \(\displaystyle x>0\), \(\displaystyle \bigl|\,f^{(n)}(x)\bigr|<\dfrac{1}{n+1}\), where \(\displaystyle f^{(n)}(x)\) denotes the n-th derivative of \(\displaystyle f\).
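Before the proofs below, here is a quick numerical sanity check of the claim (not a proof): it computes the $n$-th derivative symbolically with SymPy and compares $|f^{(n)}(x)|$ with $\dfrac{1}{n+1}$ on a grid of positive $x$-values; the range of $n$ and the grid are arbitrary choices.

```python
# Numerical sanity check of the claimed bound (not a proof):
# for a few n, evaluate |f^(n)(x)| on a grid of x > 0 and compare with 1/(n+1).
import numpy as np
import sympy as sp

x = sp.symbols('x')
f = sp.sin(x) / x

for n in range(1, 6):
    fn = sp.lambdify(x, sp.diff(f, x, n), 'numpy')   # n-th derivative as a NumPy function
    xs = np.linspace(0.01, 50.0, 5000)               # arbitrary grid of positive x
    max_abs = np.max(np.abs(fn(xs)))
    print(f"n={n}:  max |f^({n})(x)| on grid = {max_abs:.6f}   bound 1/(n+1) = {1/(n+1):.6f}")
```

On such a grid the maximum of $|f^{(n)}|$ sits just below $\dfrac{1}{n+1}$ for even $n$, which already suggests the bound is tight.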
 
  • #2
Fallen Angel said:
Hi,

My first challenge was not very popular so I bring you another one.

Let us define \(\displaystyle f(x)=\dfrac{\sin(x)}{x}\) for \(\displaystyle x>0\).

Prove that for every \(\displaystyle n\in \mathbb{N}\) and every \(\displaystyle x>0\), \(\displaystyle \bigl|\,f^{(n)}(x)\bigr| < \dfrac{1}{n+1}\), where \(\displaystyle f^{(n)}(x)\) denotes the n-th derivative of \(\displaystyle f\).
[sp]The $n$th derivative of $\dfrac{\sin x}x = x^{-1}\sin x$ is given by Leibniz's formula as \(\displaystyle \sum_{k=0}^n {n\choose k} \frac{d^k}{dx^k}(\sin x)\frac{d^{n-k}}{dx^{n-k}}(x^{-1}).\)

The derivatives of $\sin x$ oscillate between $\pm\sin x$ and $\pm\cos x$. I found it most convenient to regard $\sin x$ as the imaginary part of $e^{ix}$, so that its $k$th derivative can be written as $\mathrm{Im}\bigl(i^ke^{ix}\bigr)$. The $k$th derivative of $x^{-1}$ is $(-1)^kk!\,x^{-k-1}.$ So the Leibniz formula becomes $$\frac{d^n}{dx^n}\Bigl(\frac{\sin x}x\Bigr) = \sum_{k=0}^n {n\choose k} \mathrm{Im}\bigl(i^ke^{ix}\bigr) (-1)^{n-k}(n-k)!\,x^{-n+k-1}.$$

We want to show that the absolute value of this is less than $\dfrac1{n+1}$. After multiplying both sides by $x^{n+1}$, that becomes $$\left| \sum_{k=0}^n {n\choose k} \mathrm{Im}\bigl(i^ke^{ix}\bigr) (-1)^{n-k}(n-k)!\,x^k \right| <\frac{x^{n+1}}{n+1}.\qquad(*)$$ So we want to show that (*) holds for all $x>0$. Notice that both sides of (*) are zero when $x=0$. The idea is to differentiate the expression inside the absolute value on the left of (*), show that its absolute value is at most the derivative $x^n$ of the right side, and then integrate from $0$ to $x$; that will show that the left side of (*) is always less than the right side.

By the product rule, $$\frac d{dx} \sum_{k=0}^n {n\choose k} \mathrm{Im}\bigl(i^ke^{ix}\bigr) (-1)^{n-k}(n-k)!\,x^k = \sum_{k=0}^n \frac{n!}{k!}(-1)^{n-k}\Bigl( \mathrm{Im}\bigl(i^{k+1}e^{ix}\bigr)x^k + \mathrm{Im}\bigl(i^ke^{ix}\bigr)kx^{k-1} \Bigr).\qquad(**)$$ (When $k=0$ the second term in the big parentheses is $0$.) In fact, the right side of (**) is a telescoping sum: for each power of $x$ apart from $x^0$ and $x^n$, the coefficient in the second term in the big parentheses is the negative of the coefficient of the same power of $x$ occurring in the first term in the parentheses. So the only term that survives is the term in $x^n$, namely $\mathrm{Im}\bigl(i^{n+1}e^{ix}\bigr)x^n$. But that is less in absolute value than $\dfrac d{dx}\Bigl(\dfrac{x^{n+1}}{n+1}\Bigr) = x^n$, just as we wanted.

(Strictly speaking, the absolute value of $\mathrm{Im}\bigl(i^{n+1}e^{ix}\bigr)x^n$ is actually equal to $x^n$ when $x$ is equal to some multiples of $\pi/2$. But that does not affect the argument.)[/sp]
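The telescoping step can be spot-checked symbolically. The sketch below (only a sanity check, not part of the argument) uses SymPy to confirm for a few small $n$ that the derivative of the expression inside the absolute value in (*), which is $x^{n+1}f^{(n)}(x)$, equals $\mathrm{Im}\bigl(i^{n+1}e^{ix}\bigr)x^n=\sin\!\bigl(x+(n+1)\pi/2\bigr)x^n$.

```python
# Symbolic check, for a few n, of the identity behind the telescoping step:
#   d/dx [ x^(n+1) * f^(n)(x) ] = Im(i^(n+1) e^(ix)) * x^n = sin(x + (n+1)*pi/2) * x^n
import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.sin(x) / x

for n in range(0, 6):
    lhs = sp.diff(x**(n + 1) * sp.diff(f, x, n), x)
    rhs = sp.sin(x + (n + 1) * sp.pi / 2) * x**n
    print(n, sp.simplify(lhs - rhs) == 0)            # expect True for every n
```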
 
  • #3
Good work Opalg!

My solution is slightly different.

Note that $f(x)=\dfrac{\sin(x)}{x}=\displaystyle\int_{0}^{1}\cos(xt)\,dt$.
Then $f^{(n)}(x)=\displaystyle\int_{0}^{1}\dfrac{\partial ^{n}}{\partial x^{n}}\cos(xt)\,dt=\displaystyle\int_{0}^{1}t^{n}g_{n}(xt)\,dt,$

where $g_{n}(xt)=\pm\cos(xt)$ or $\pm\sin(xt)$, so $|g_{n}(xt)|\leq 1$ with equality at only finitely many points of $[0,1]$ for each fixed $x>0$. Hence

$|f^{(n)}(x)|\leq \displaystyle\int_{0}^{1}t^{n}|g_{n}(xt)|\,dt<\displaystyle\int_{0}^{1}t^{n}\,dt=\dfrac{1}{n+1}.$
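The representation used here, $f^{(n)}(x)=\displaystyle\int_{0}^{1}t^{n}g_{n}(xt)\,dt$ with $g_{n}(u)=\cos\!\bigl(u+n\pi/2\bigr)$, can be spot-checked numerically. The sketch below uses mpmath with a few arbitrary sample values of $n$ and $x$, purely as a sanity check.

```python
# Numerical spot-check of  f^(n)(x) = \int_0^1 t^n cos(x t + n*pi/2) dt :
# compare mpmath's numerical n-th derivative of sin(x)/x with the integral.
import mpmath as mp

f = lambda x: mp.sin(x) / x

for n in range(1, 5):
    for x0 in (0.7, 3.0, 10.0):                      # arbitrary sample points
        deriv = mp.diff(f, x0, n)                    # numerical n-th derivative
        integral = mp.quad(lambda t: t**n * mp.cos(x0 * t + n * mp.pi / 2), [0, 1])
        print(f"n={n}, x={x0}:  derivative = {float(deriv):+.8f}   integral = {float(integral):+.8f}")
```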
 
  • #4
Fallen Angel said:
Good work Opalg!

My solution is slightly different.

Note that $f(x)=\dfrac{\sin(x)}{x}=\displaystyle\int_{0}^{1}\cos(xt)\,dt$.
Then $f^{(n)}(x)=\displaystyle\int_{0}^{1}\dfrac{\partial ^{n}}{\partial x^{n}}\cos(xt)\,dt=\displaystyle\int_{0}^{1}t^{n}g_{n}(xt)\,dt,$

where $g_{n}(xt)=\pm\cos(xt)$ or $\pm\sin(xt)$, so $|g_{n}(xt)|\leq 1$ with equality at only finitely many points of $[0,1]$ for each fixed $x>0$. Hence

$|f^{(n)}(x)|\leq \displaystyle\int_{0}^{1}t^{n}|g_{n}(xt)|\,dt<\displaystyle\int_{0}^{1}t^{n}\,dt=\dfrac{1}{n+1}.$
Very neat! (Rock)
 
  • #5
A couple of remarks on other possible approaches.

One natural idea is induction on $n$: prove the base case $|f'(x)|<\dfrac{1}{2}$ directly (the quotient rule gives $f'(x)=\dfrac{x\cos(x)-\sin(x)}{x^{2}}$, although bounding this already takes some work) and then try to pass from $n=k$ to $n=k+1$. The inductive step is not straightforward, because a bound on $|f^{(k)}(x)|$ does not by itself control $|f^{(k+1)}(x)|$; some extra structure, such as the integral representation in post #3 or the Leibniz identity in post #2, is needed to make it go through.

The Maclaurin series is still informative. From $f(x)=\displaystyle\sum_{k\geq 0}\dfrac{(-1)^{k}x^{2k}}{(2k+1)!}$, differentiating term by term shows that $f^{(n)}(x)\to\dfrac{(-1)^{n/2}}{n+1}$ as $x\to 0^{+}$ when $n$ is even (and $f^{(n)}(x)\to 0$ when $n$ is odd). So for even $n$ the bound $\dfrac{1}{n+1}$ is approached in the limit: it cannot be replaced by anything smaller, and the strict inequality genuinely requires $x>0$.

In conclusion, there is more than one way to establish the inequality for $f(x)=\dfrac{\sin(x)}{x}$: the integral representation of post #3 gives the shortest complete proof, while the Leibniz-formula argument of post #2 works directly with the derivatives.
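The sharpness remark can be checked with SymPy as well; the sketch below (a sanity check only) computes $\lim_{x\to 0^{+}}f^{(n)}(x)$ for the first few $n$ and compares it with $\dfrac{(-1)^{n/2}}{n+1}$ for even $n$ and $0$ for odd $n$.

```python
# SymPy check of the limits of f^(n)(x) as x -> 0+ :
# expect (-1)^(n/2)/(n+1) for even n and 0 for odd n.
import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.sin(x) / x

for n in range(1, 7):
    lim = sp.limit(sp.diff(f, x, n), x, 0, '+')
    expected = sp.Rational((-1)**(n // 2), n + 1) if n % 2 == 0 else sp.Integer(0)
    print(n, lim, lim == expected)                   # expect True for every n
```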
 

FAQ: How can we prove the derivative inequality for f(x)=sin(x)/x?

What is a derivative inequality?

A derivative inequality is an inequality involving the derivatives of one or more functions, for example the bound $\bigl|f^{(n)}(x)\bigr|<\frac{1}{n+1}$ in this thread. Such inequalities are commonly used to compare rates of change and to locate the maximum or minimum values of a function.

How is a derivative inequality different from a regular inequality?

A derivative inequality involves the derivatives of functions, while a regular inequality simply compares two quantities. In a derivative inequality, the rates of change of the functions are compared, rather than the functions themselves.

What is the purpose of using derivative inequalities?

The main purpose of using derivative inequalities is to find the maximum or minimum values of a function. They are also useful in optimization problems, where the goal is to find the maximum or minimum value of a function under certain constraints.

What are some common techniques for solving derivative inequalities?

Some common techniques for solving derivative inequalities include finding critical points of the functions, using the first or second derivative test, and using algebraic manipulation to simplify the inequality.
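As a small, generic illustration of the critical-point and second-derivative-test techniques mentioned above (using SymPy, with an arbitrary example function that has nothing to do with the thread's problem):

```python
# Generic illustration of finding critical points and classifying them
# with the second-derivative test, for the arbitrary example g(x) = x^3 - 3x.
import sympy as sp

x = sp.symbols('x', real=True)
g = x**3 - 3*x

critical_points = sp.solve(sp.Eq(sp.diff(g, x), 0), x)   # solve g'(x) = 0
for c in critical_points:
    second = sp.diff(g, x, 2).subs(x, c)                  # g''(c)
    kind = "local min" if second > 0 else ("local max" if second < 0 else "test inconclusive")
    print(f"x = {c}: g''(x) = {second} -> {kind}")
```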

Are there any real-world applications of derivative inequalities?

Yes, derivative inequalities have many real-world applications. For example, they can be used in economics to determine the maximum profit or minimum cost for a business, and in physics to find the maximum or minimum velocity of an object under certain constraints.
