How is Equation (1) Equivalent to the Derivative Definition in Theorem 7.1?

In summary, this thread discusses the proof of Theorem 7.1 in the book "Theory of Functions of a Complex Variable" by A. I. Markushevich. It focuses on the definition of the derivative of f(z) and the relationship between two equations given by Markushevich, and concludes with a proposed method for formally and rigorously deriving one equation from the other.
  • #1
Math Amateur
Gold Member
MHB
I am reading the book: "Theory of Functions of a Complex Variable" by A. I. Markushevich (Part 1) ...

I need some help with an aspect of the proof of Theorem 7.1. The statement of Theorem 7.1 reads as follows:

View attachment 9330

At the start of the above proof by Markushevich we read the following:

"If \(\displaystyle f(z)\) has a derivative \(\displaystyle f'_E(z_0)\) at \(\displaystyle z_0\), then by definition

\(\displaystyle \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )\)

where \(\displaystyle \epsilon ( z, z_0 ) \to 0\) as \(\displaystyle \Delta z \to 0\). ... "

Now, previously, in Equation (7.1) at the start of Chapter 7, Markushevich defined \(\displaystyle f'_E(z_0)\) as follows:

\(\displaystyle f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }\) ... ... ... (1)

How exactly (formally and rigorously) is equation (1) the same as \(\displaystyle \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )\)?

Strictly speaking, shouldn't Markushevich be deriving \(\displaystyle \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )\) from equation (1)?

Peter
 

Attachments

  • Markushevich - Theorem 7.1 and Proof ... .png
  • #2
Peter said:
I am reading the book: "Theory of Functions of a Complex Variable" by A. I. Markushevich (Part 1) ...
At the start of the above proof by Markushevich we read the following:

"If \(\displaystyle f(z)\) has a derivative \(\displaystyle f'_E(z_0)\) at \(\displaystyle z_0\), then by definition

\(\displaystyle \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )\)

where \(\displaystyle \epsilon ( z, z_0 ) \to 0\) as \(\displaystyle \Delta z \to 0\). ... "

Now, previously, in Equation (7.1) at the start of Chapter 7, Markushevich defined \(\displaystyle f'_E(z_0)\) as follows:

\(\displaystyle f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }\) ... ... ... (1)

How exactly (formally and rigorously) is equation (1) the same as \(\displaystyle \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )\)?

Strictly speaking, shouldn't Markushevich be deriving \(\displaystyle \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )\) from equation (1)?

Peter

I think I know what is being said, though the notation is a bit all over the place here. E.g. you wrote
\(\displaystyle f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }\)
but I didn't see a limit explicitly taken, so it doesn't seem like a definition of a derivative that I'm familiar with.

Anyway, as is often the case, why not try to estimate the difference between those two definitions, and use the oh so important triangle inequality.

This gives

$\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - \big(f'_E(z_0) + \epsilon ( z, z_0 )\big) \Big \vert \leq \Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert + \Big \vert \epsilon ( z, z_0 ) \Big \vert $

by the triangle inequality. Now pass to limits:
i.e., for any $\epsilon \gt 0$ we can select a $\delta_1$ neighborhood (i.e. for all $\big \vert z - z_0 \big \vert \lt \delta_1$) such that
$\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert \lt \frac{\epsilon}{2}$

and $\delta_2$ neighborhood such that
$ \Big \vert \epsilon ( z, z_0 ) \Big \vert \lt \frac{\epsilon}{2}$

and select $\delta = \min \big(\delta_1, \delta_2\big)$ so you have

$\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - \big(f'_E(z_0) + \epsilon ( z, z_0 )\big) \Big \vert \leq \Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert + \Big \vert \epsilon ( z, z_0 ) \Big \vert \lt \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon$
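In fact, the two formulations can be tied together even more directly. The following sketch, in Markushevich's notation, shows they are interchangeable:

Define $\epsilon(z, z_0) := \dfrac{\Delta_E f(z)}{\Delta z} - f'_E(z_0)$ for $z \neq z_0$. Then, by construction,

$\dfrac{\Delta_E f(z)}{\Delta z} = f'_E(z_0) + \epsilon(z, z_0)$

and the limit statement $f'_E(z_0) = \lim_{\Delta z \to 0} \dfrac{\Delta_E f(z)}{\Delta z}$ holds if and only if $\epsilon(z, z_0) \to 0$ as $\Delta z \to 0$. So each equation is obtained from the other simply by rearranging and taking (or unpacking) the limit.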
 
  • #3
steep said:
I think I know what is being said, though the notation is a bit all over the place here. E.g. you wrote
\(\displaystyle f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }\)
but I didn't see a limit explicitly taken so it doesn't seem like a definition of a derivative that I'm familiar with.

Anyway, as is often the case, why not try to estimate the difference between those two definitions, and use the oh so important triangle inequality.

This gives

$\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - \big(f'_E(z_0) + \epsilon ( z, z_0 )\big) \Big \vert \leq \Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert + \Big \vert \epsilon ( z, z_0 ) \Big \vert $

by triangle inequality. Now pass limits
i.e. consider for any $\epsilon \gt 0$ we can select a $\delta_1$ neighborhood (i.e. for all $\big \vert z - z_0 \big \vert \lt \delta_1$) such that
$\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert \lt \frac{\epsilon}{2}$

and $\delta_2$ neighborhood such that
$ \Big \vert \epsilon ( z, z_0 ) \Big \vert \lt \frac{\epsilon}{2}$

and select $\delta = \min \big(\delta_1, \delta_2\big)$ so you have

$\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - \big(f'_E(z_0) + \epsilon ( z, z_0 )\big) \Big \vert \leq \Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert + \Big \vert \epsilon ( z, z_0 ) \Big \vert \lt \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon$
Hi steep...

Thanks so much for your post ...

I am still reflecting on what you have written ... I must apologize for a serious typo in equation (1).

I wrote

\(\displaystyle f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }\) ... ... ... (1)

when I should have written

\(\displaystyle f'_E(z_0) = \lim_{ z \to z_0} \frac{ f(z) - f(z_0) }{ z - z_0 } = \lim_{ \Delta z \to 0} \frac{ \Delta_E f(z) }{ \Delta z }\) ... ... ... (1)

I should also have posted the beginning of Markushevich's Chapter 7 to give readers access to his definitions ... so I am posting that now, as follows:

View attachment 9332
View attachment 9333
Hope that helps ...

Peter
 

Attachments

  • Markushevich - 1 - Start of Ch. 7, Section 28 ... PART 1 .png
  • Markushevich - 2 - Start of Ch. 7, Section 28 ... PART 2 ... .png
  • #4
btw it seems worth pointing out that the (limiting) difference quotient is in some sense the 'original' derivative definition for single variable calc.

However the second definition introduced here is, in effect, that the derivative (if it exists) is the best linear approximation of a function over a sufficiently small neighborhood, period. This definition / interpretation is one that generalizes to higher dimensions. And since complex analysis is one variable analysis, but is 'kind of like' multivariable analysis ($\mathbb R^2$ looms), any perceived difference between the definitions is a good thing to dwell on.
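Concretely, multiplying Markushevich's equation through by $\Delta z$ gives the linear-approximation form just described (a sketch, using the thread's notation):

$f(z) = f(z_0) + f'_E(z_0)\,(z - z_0) + \epsilon(z, z_0)\,(z - z_0)$

where the error term $\epsilon(z, z_0)(z - z_0)$ is $o(\vert z - z_0 \vert)$ since $\epsilon(z, z_0) \to 0$. This is exactly the shape of the definition that generalizes to the Fréchet derivative in higher dimensions.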

Another thing: a long-overdue release/update of Beardon's "Complex Analysis: The Argument Principle" is coming next month as a Dover book. I think you may have a bigger math library than me, but I thought I'd mention it.
 
  • #5
steep said:
btw it seems worth pointing out that the (limiting) difference quotient is in some sense the 'original' derivative definition for single variable calc.

However the second definition introduced here is, in effect, that the derivative (if it exists) is the best linear approximation of a function over a sufficiently small neighborhood, period. This definition / interpretation is one that generalizes to higher dimensions. And since complex analysis is one variable analysis, but is 'kind of like' multivariable analysis ($\mathbb R^2$ looms), any perceived difference between the definitions is a good thing to dwell on.

Another thing-- a long overdue release / update to Beardon's "Complex Analysis: The Argument Principle" is coming next month as a Dover book. I think you may have a bigger math library than me but I thought I'd mention it.

Thanks for your most helpful posts, steep ...

I'll definitely keep a watch out for the release of Beardon's book ... I find that being able to consult several texts' treatments of a mathematical topic helps my learning ...

Thanks again ...

Peter
 

FAQ: How is Equation (1) Equivalent to the Derivative Definition in Theorem 7.1?

What is the definition of differentiability for functions of a complex variable?

The definition of differentiability for functions of a complex variable is similar to the definition for real-valued functions. A function f(z) is said to be differentiable at a point z0 if the limit (f(z) - f(z0))/(z - z0) exists as z approaches z0. This limit is known as the derivative of f(z) at z0 and is denoted by f'(z0).
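As a quick numerical sketch of this definition (not from the thread; the function $f(z) = z^2$ and the step sizes are illustrative choices), the difference quotient approaches the same value from every direction in the complex plane when the complex derivative exists:

```python
# Numerical illustration: for f(z) = z^2 the difference quotient
# (f(z0 + dz) - f(z0)) / dz approaches f'(z0) = 2*z0 as dz -> 0,
# regardless of the direction of approach.

def diff_quotient(f, z0, dz):
    return (f(z0 + dz) - f(z0)) / dz

f = lambda z: z * z
z0 = 1 + 1j

# Approach z0 along several directions: real, imaginary, diagonal.
for direction in (1, 1j, 1 + 1j):
    dz = 1e-6 * direction / abs(direction)  # small step in that direction
    dq = diff_quotient(f, z0, dz)
    print(dq)  # close to 2 + 2j in every case
```

Here the exact quotient is $2z_0 + dz$, so the error shrinks linearly with the step size.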

What is the difference between differentiability and continuity for functions of a complex variable?

Continuity is a necessary but not sufficient condition for differentiability. A function can be continuous at a point without being differentiable there, but a function must be continuous at a point in order to be differentiable at that point.

What are the Cauchy-Riemann equations and how do they relate to the differentiability of functions of a complex variable?

The Cauchy-Riemann equations are a set of necessary conditions for the differentiability of a function of a complex variable. Writing f(z) = u(x,y) + iv(x,y), where u and v are the real and imaginary parts of f, they state that if f is differentiable at a point z0 then ux = vy and uy = -vx at that point, in which case f'(z0) = ux + ivx.

What is Theorem 7.1 in Markushevich's book and why is it important in the study of differentiability of functions of a complex variable?

Theorem 7.1 in Markushevich's book states that if a function of a complex variable is analytic in a domain D, then it is infinitely differentiable in D. This theorem is important because it shows that the concept of differentiability for functions of a complex variable is closely related to the concept of analyticity, and that analytic functions are very well-behaved in terms of their differentiability properties.

Can a function of a complex variable be differentiable at a point but not analytic in a domain?

Yes, it is possible for a function to be differentiable at a point but not analytic in any domain. Analyticity at a point requires differentiability throughout some neighborhood of that point, not just at the point itself. A standard example is f(z) = |z|^2, which satisfies the Cauchy-Riemann equations (and is complex-differentiable) only at z = 0, and is therefore analytic nowhere.
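A standard concrete example here is $f(z) = \vert z \vert^2$, which is complex-differentiable at $z = 0$ and nowhere else. The following numerical sketch (step sizes are illustrative choices of mine) shows the direction-dependence of its difference quotient away from the origin:

```python
# Illustration: f(z) = |z|^2 has a direction-dependent difference
# quotient at z0 = 1 (so no complex derivative exists there), while
# at z0 = 0 the quotient tends to 0 from every direction.

def diff_quotient(f, z0, dz):
    return (f(z0 + dz) - f(z0)) / dz

f = lambda z: abs(z) ** 2

h = 1e-6
real_dir = diff_quotient(f, 1, h)             # ~2 along the real axis
imag_dir = diff_quotient(f, 1, h * 1j)        # ~0 along the imaginary axis
at_zero = diff_quotient(f, 0, h * (1 + 1j))   # ~0: derivative 0 at the origin
print(real_dir, imag_dir, at_zero)
```

Since the quotient at $z_0 = 1$ tends to 2 along the real axis but to 0 along the imaginary axis, no single limit exists there, even though the limit at $z_0 = 0$ exists and equals 0.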
