What is the significance of the global maximum in differentiable functions?

In summary, we are trying to show that for a differentiable (and hence continuous) function $f:[a,b]\rightarrow \mathbb{R}$ with a global maximum at $x^{\star}$, the following holds: $$f'(x^{\star})\ \begin{cases}=0, & \text{if } x^{\star} \in (a,b)\\ \leq 0, & \text{if } x^{\star}=a\\ \geq 0, & \text{if } x^{\star}=b\end{cases}$$ To prove this, we use the definition of the derivative at $x^{\star}$ and examine the appropriate one-sided limits of the difference quotient in each case.
  • #1
mathmari
Hey! :eek:

I am looking at the following:

The differentiable, and therefore also continuous, function $f:[a,b]\rightarrow \mathbb{R}$ attains its global maximum at a point $x^{\star}$. Show that the following holds: $$f'(x^{\star})\ \begin{cases}
=0, & \text{if } x^{\star} \in (a,b)\\
\leq 0, & \text{if } x^{\star}=a\\
\geq 0, & \text{if } x^{\star}=b
\end{cases}$$

Could you give me a hint as to what we are supposed to do? (Wondering)

I have done the following:

Since the function attains its global maximum at $x^{\star}$, we have $f(x^{\star}+ \epsilon) \le f(x^{\star})$ for every $\epsilon$ with $x^{\star}+\epsilon \in [a,b]$.

We use the definition of the derivative at $x_0$ : \begin{equation*}f'(x_0)= \lim_{h \rightarrow 0} \frac{f(x_0+h)-f(x_0)}{h}\end{equation*}
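At the endpoints only one side of this limit is available, so in those cases the corresponding one-sided limits are used:
\begin{equation*}\lim_{h \rightarrow 0^+} \frac{f(x_0+h)-f(x_0)}{h} \ \text{ and } \ \lim_{h \rightarrow 0^-} \frac{f(x_0+h)-f(x_0)}{h}\end{equation*}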
For $x^{\star}=a$ we have the following:

Let $h>0$. Since the domain is $[a,b]$, the point $a+h$ lies in $[a,b]$ for small $h>0$, and so $f(a+h)$ is defined.

Then, since $f(a+h)\le f(a)$ for all such $h$, we have: \begin{align*}&f'(a)= \lim_{h \rightarrow 0^+} \frac{f(a+h)-f(a)}{h}\leq \lim_{h \rightarrow 0^+} \frac{f(a)-f(a)}{h}=\lim_{h \rightarrow 0^+} \frac{0}{h}=0 \\ & \Rightarrow f'(a)\leq 0\end{align*}
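For example, $f(x)=-x$ on $[0,1]$ attains its global maximum at $x^{\star}=a=0$, and there \begin{equation*}f'(a)=-1\leq 0,\end{equation*} so the inequality can indeed be strict.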

For $x^{\star}=b$ we have the following:

Let $h<0$. Since the domain is $[a,b]$, the point $b+h$ lies in $[a,b]$ for small $|h|$, and so $f(b+h)$ is defined for $h<0$. Write $h=-m$ with $m>0$.

Then, since $f(b-m)\le f(b)$, we have: \begin{align*}&f'(b)= \lim_{h \rightarrow 0^-} \frac{f(b+h)-f(b)}{h}=\lim_{m \rightarrow 0^+} \frac{f(b-m)-f(b)}{-m}=\lim_{m \rightarrow 0^+} \frac{f(b)-f(b-m)}{m}\geq \lim_{m \rightarrow 0^+} \frac{f(b)-f(b)}{m}=\lim_{m \rightarrow 0^+} \frac{0}{m}=0 \\ & \Rightarrow f'(b)\geq 0\end{align*}
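Similarly, $f(x)=x$ on $[0,1]$ attains its global maximum at $x^{\star}=b=1$, and there \begin{equation*}f'(b)=1\geq 0.\end{equation*}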
For $x^{\star}\in (a,b)$ we have the following:

Writing $h=-m$ with $m>0$ for the approach from the left, and using that $f$ is differentiable at $x^{\star}$ (so that both one-sided limits equal $f'(x^{\star})$), we get:
\begin{align*}&0=\lim_{m \rightarrow 0^+} \frac{0}{-m} \leq \lim_{m \rightarrow 0^+} \frac{f(x^{\star}-m)-f(x^{\star})}{-m} = f'(x^{\star})=\lim_{h \rightarrow 0^+} \frac{f(x^{\star} +h)-f(x^{\star})}{h}\leq\lim_{h \rightarrow 0^+} \frac{0}{h}=0 \\ & \Rightarrow f'(x^{\star})=0\end{align*}
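As a check, $f(x)=-\left(x-\tfrac{1}{2}\right)^2$ on $[0,1]$ attains its global maximum at $x^{\star}=\tfrac{1}{2}\in(0,1)$, and indeed \begin{equation*}f'\!\left(\tfrac{1}{2}\right)=0.\end{equation*}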
Is everything correct? Could I improve something? Are the one-sided limits correct? (Wondering)
 
  • #2


Hi there! Your approach is exactly right: apply the definition of the derivative and, in each case, use the one-sided limit that stays inside $[a,b]$.

At the endpoint $x^{\star}=a$ only $h>0$ is admissible, so only the right-hand limit is available; since $f(a+h)\le f(a)$ and $h>0$, every difference quotient is $\le 0$, and passing to the limit gives $f'(a)\le 0$. At $x^{\star}=b$ only $h<0$ is admissible, the difference quotients are $\ge 0$, and passing to the limit gives $f'(b)\ge 0$.

For $x^{\star}\in (a,b)$ both one-sided limits exist and, because $f$ is differentiable at $x^{\star}$, they both equal $f'(x^{\star})$. The right-hand quotients are $\le 0$ and the left-hand quotients are $\ge 0$, so $0\le f'(x^{\star})\le 0$, and hence $f'(x^{\star})=0$. One point to be careful about: when you pass from an inequality between difference quotients to an inequality between their limits, the inequality becomes non-strict, so it should be $\le$ and $\ge$ rather than $<$ and $>$.

Other than that, everything looks good! Keep up the good work.
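If you want to convince yourself numerically, here is a quick sanity check with one-sided difference quotients; the example functions, the interval $[0,1]$, and the step size are just illustrative choices, not part of the exercise:

# Numerical sanity check (not a proof): one-sided difference quotients
# of example functions on [0, 1] at their global maximisers.

def f(x):
    # interior maximum at x = 0.7
    return -(x - 0.7) ** 2

a, b, x_star = 0.0, 1.0, 0.7
step = 1e-6

# Interior maximum: both one-sided quotients should be close to 0.
right = (f(x_star + step) - f(x_star)) / step
left = (f(x_star - step) - f(x_star)) / (-step)
print(right, left)  # both approximately 0

def g(x):
    # maximum at the left endpoint a = 0
    return -x

print((g(a + step) - g(a)) / step)  # approximately -1, so g'(a) <= 0

def k(x):
    # maximum at the right endpoint b = 1
    return x

print((k(b) - k(b - step)) / step)  # approximately 1, so k'(b) >= 0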
 
