Proof of Euler's Formula with No Angle Addition

In summary, the conversation discusses different approaches to prove Euler's formula $e^{i\theta} = \cos(\theta) + i\sin(\theta)$ without using the angle addition formula for sine and cosine. One approach involves using the formal derivative of $z(t)$ and justifying that it does not involve the angle sum identity. Another approach involves defining the functions $e^{i\theta}$, $\cos\theta$, and $\sin\theta$ by their power series and deriving all other properties, including the angle sum identity, from that. A third approach involves proving $e^{i(\theta+\phi)} = e^{i\theta}e^{i\phi}$ using the Cauchy product theorem and deducing the angle-sum identities by taking real and imaginary parts.
  • #1
mathbalarka
Deduce a proof of Euler's formula $e^{i \theta} = \cos(\theta) + i\sin(\theta)$ without using the angle addition formulas for sine and cosine.

Balarka
.
 
  • #2
mathbalarka said:
Deduce a proof of Euler's formula $e^{i \theta} = \cos(\theta) + i\sin(\theta)$ without using the angle addition formulas for sine and cosine.

Balarka
.

Suppose we have a point moving along the unit circle with constant angular speed (one full revolution every $2\pi$ units of time), so that, denoting the point by the complex variable $z$, we have...

$\displaystyle z(t) = \cos t + i\ \sin t\ (1)$

Differentiating $z(t)$ with respect to $t$, we have...

$\displaystyle z^{\ '} (t) = - \sin t + i\ \cos t = i\ z(t)\ (2)$

Now if we consider the ODE...

$\displaystyle z^{\ '} = i\ z,\ z(0)=1\ (3)$

... its solution is...

$\displaystyle z(t) = e^{i\ t}\ (4)$

Kind regards

$\chi$ $\sigma$
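
Editorial note: a minimal numerical sketch of chisigma's argument (Python; the function name and step count are our own illustrative choices, not part of the original post). It integrates the ODE (3) with forward Euler steps and compares the result with definition (1):

```python
# Integrate z' = i*z, z(0) = 1, and compare with cos(t) + i*sin(t).
import math

def solve_euler_ode(t, steps=200_000):
    """Forward-Euler integration of z' = i*z with z(0) = 1 (error ~ O(1/steps))."""
    z, h = 1 + 0j, t / steps
    for _ in range(steps):
        z += h * 1j * z          # one Euler step of z' = i*z
    return z

t = 1.0
print(solve_euler_ode(t))                 # ~ (0.54030 + 0.84147j)
print(complex(math.cos(t), math.sin(t)))  # cos(1) + i*sin(1)
```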
 
  • #3
First things first, thanks for participating, chisigma.

Your start was good, but the solution isn't complete. Can you verify step (2) and show that you are not using the angle-sum identity while computing $z'(t)$?
 
  • #4
mathbalarka said:
First things first, thanks for participating, chisigma.

Your start was good, but the solution isn't complete. Can you verify step (2) and show that you are not using the angle-sum identity while computing $z'(t)$?

The formal derivative is...

$\displaystyle \frac{d}{dt}\ z(t)= \frac{d}{dt}\ (\cos t + i\ \sin t) = - \sin t + i\ \cos t = i\ z(t)\ (1)$

Kind regards

$\chi$ $\sigma$
 
  • #5
\(\displaystyle e^{z} = \sum_{n\geq 0}\frac{z^n}{n!}\)

\begin{align}
e^{i \theta } &= \sum_{n\geq 0}\frac{(i\theta)^n}{n!}\\
& = \sum_{n\geq 0}\frac{(-1)^n \theta ^{2n}}{(2n)!}+i\sum_{n\geq 0}\frac{(-1)^n \theta ^{2n+1}}{(2n+1)!}\\
& = \cos \theta +i\sin \theta
\end{align}
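
Editorial note: the split into even and odd terms above is justified by absolute convergence (so the rearrangement is legal) together with nothing more than the powers of $i$:

$\displaystyle i^{2n} = (-1)^n, \qquad i^{2n+1} = i\ (-1)^n, \qquad \text{hence} \qquad \sum_{n\geq 0}\frac{(i\theta)^n}{n!} = \sum_{n\geq 0}\frac{(-1)^n \theta^{2n}}{(2n)!} + i\sum_{n\geq 0}\frac{(-1)^n \theta^{2n+1}}{(2n+1)!}$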
 
  • #6
Yes, but note that you are differentiating sine and cosine while differentiating $z(t)$ -- can you do that without the angle-sum identity?


Thanks for participating, Zaid.

I'd ask you the same as chisigma -- can you justify that your steps don't rely on the angle-sum identity in the background?
 
  • #7
In a rigorous approach to analysis, Zaid Alyafey's answer is the only acceptable way. The functions $e^{i\theta}$, $\cos\theta$ and $\sin\theta$ must all be defined by their power series, and all their other properties (including the angle sum identity) derived from that. In that context, Euler's relation becomes totally trivial.
 
  • #8
mathbalarka said:
I'd ask you the same as chisigma -- can you justify that your steps don't rely on the angle-sum identity in the background?

That is a difficult task unless you are claiming that I am using it in a specific step.
 
  • #9
@Opalg: First, Zaid's answer is not acceptable unless he clarifies it, since my question asks about deriving the formula without the angle-sum identity. Second, the angle-sum identity is usually not derived from Euler's formula, as that makes things ambiguous.

This is really a very deep question of calculus, and I will ask everyone to think a little further than the usual analyst. The problem is quite hard to see, but the answer lies in the very basics.


ZaidAlyafey said:
That is a difficult task unless you are claiming that I am using it in a specific step.

I am claiming it, yes. If you point the error out, though, you can bypass it.
 
  • #10
[1] Either you are claiming that the Taylor expansion depends on trigonometric identities.
That is impossible, because we can define it for arbitrary differentiable functions.

[2] Or that the power expansion \(\displaystyle e^z= \sum_{n\geq 0}\frac{z^n}{n!}\) is not valid unless we use the trig identities.
The power expansion of \(\displaystyle e^z\) exists and is unique because of the analyticity of the function: the Cauchy-Riemann equations are satisfied (write $z=x+iy$), and all partial derivatives exist and are continuous.

[3] Or you are claiming that \(\displaystyle \lim_{z \to 0}\frac{e^z-1}{z}=1\) is not valid unless we use trig identities. Claim [3] might work, though; I will think about it later.
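
Editorial note: if $e^z$ is given by its series, claim [3] in fact needs neither trigonometry nor L'Hôpital:

$\displaystyle \frac{e^z - 1}{z} = \sum_{n\geq 1}\frac{z^{n-1}}{n!} = 1 + \frac{z}{2!} + \frac{z^2}{3!} + \cdots \longrightarrow 1 \quad \text{as } z \to 0$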
 
  • #11
No, none of the points you refer to has anything to do with the angle-sum identity. [3] can be proved by applying L'Hôpital's rule, and doesn't involve the angle-sum identity. You are missing something vital.
 
  • #12
mathbalarka said:
the angle-sum identity is usually not derived from Euler's formula, as that makes things ambiguous.
I was not saying that the angle-sum identity is derived from Euler's formula. It is proved by multiplying the defining power series, using the theorem that the Cauchy product of two absolutely convergent series converges absolutely to the product of their sums. From that, one can deduce that $e^{i(\theta+\phi)} = e^{i\theta}e^{i\phi}$. The angle-sum identities then follow by taking the real and imaginary parts.
 
  • #13
Opalg said:
I was not saying that the angle-sum identity is derived from Euler's formula. It is proved by multiplying the defining power series, using the theorem that the Cauchy product of two absolutely convergent series converges absolutely to the product of their sums. From that, one can deduce that $e^{i(\theta+\phi)} = e^{i\theta}e^{i\phi}$. The angle-sum identities then follow by taking the real and imaginary parts.

That depends on how you derive Euler's formula. If you find a way to derive it without the angle-sum formula, then yes, the above can be applied. Otherwise ambiguity arises, as I have indicated before.
 
  • #14
mathbalarka said:
Opalg said:
I was not saying that the angle-sum identity is derived from Euler's formula. It is proved by multiplying the defining power series, using the theorem that the Cauchy product of two absolutely convergent series converges absolutely to the product of their sums. From that, one can deduce that $e^{i(\theta+\phi)} = e^{i\theta}e^{i\phi}$. The angle-sum identities then follow by taking the real and imaginary parts.

That depends on how you derive Euler's formula. If you find a way to derive it without the angle-sum formula, then yes, the above can be applied. Otherwise ambiguity arises, as I have indicated before.
I don't understand what you mean by this ambiguity. If you define the functions $e^{i\theta}$, $\cos\theta$ and $\sin\theta$ by their power series, then the real part of the power series for $e^{i\theta}$ is the power series for $\cos\theta$, and the imaginary part is the series for $\sin\theta$. So Euler's formula $e^{i\theta} = \cos\theta + i\sin\theta$ is practically a tautology.
 
  • #15
Opalg said:
If you define the functions $e^{i\theta}$, $\cos\theta$ and $\sin\theta$ by their power series ...

I am not defining them by their power series. They are defined, as usual, by the properties of right-angled triangles throughout this context. The power series can still be derived from that, though.
Opalg said:
I don't understand what you mean by this ambiguity.

That is the whole point of this problem. Most of the problem would be solved if I pointed out to you the tautology that occurs.
 
  • #16
mathbalarka said:
I am not defining them by their power series. They are defined, as usual, by the properties of right-angled triangles throughout this context.
Okay, if you want to define trig functions in terms of triangles, then you must first define what you mean by an angle. And if you want to define angles (as radians), you have to know what is meant by arc length. But arc length, on a curve $y=f(x)$, is defined in terms of an integral \(\displaystyle \int\sqrt{1+f'(x)^2}dx\). If you try to do that for the unit circle, you will find yourself having to integrate $\dfrac1{\sqrt{1-x^2}}$, which leads you to $\arcsin x$. This becomes a circular argument (if you'll forgive the pun), and you end up trying to use the inverse sine function in order to define the sine function. That is why a rigorous treatment of trig functions has to start from the power series definition.
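
Editorial note: the step Opalg alludes to is quick to make explicit. On the upper unit semicircle $f(x) = \sqrt{1-x^2}$, so

$\displaystyle f'(x) = \frac{-x}{\sqrt{1-x^2}}, \qquad \sqrt{1+f'(x)^2} = \sqrt{1 + \frac{x^2}{1-x^2}} = \frac{1}{\sqrt{1-x^2}},$

and the arc-length integral becomes $\displaystyle \int \frac{dx}{\sqrt{1-x^2}} = \arcsin x + C$: exactly the circularity described above.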
 
  • #17
mathbalarka said:
Yes, but note that you are differentiating sine and cosine while differentiating $z(t)$ -- can you do that without the angle-sum identity?


All right!... I'll try to complete my answer by considering again the complex function...

$\displaystyle z(t) = \cos t + i\ \sin t\ (1)$

... the derivative of which is...

$\displaystyle z^{\ '} (t) = \frac{d}{dt}\ \cos t + i\ \frac{d}{dt}\ \sin t$

Now suppose we don't know $\frac{d}{d t} \cos t$ and $\frac{d}{d t} \sin t$, and let us try to arrive at them. A fundamental property of complex functions establishes that...

$\displaystyle \frac {d}{d t} \ln z(t) = \frac{z^{\ '}(t)}{z(t)}\ (2)$

... and in our case we have...

$\displaystyle \frac{d}{d t} \ln z(t)= i\ (3)$

Inserting (1) and (2) in (3), we arrive directly at the equations...

$\displaystyle \frac{d}{d t} \cos t\ \cos t + \frac{d}{d t} \sin t\ \sin t =0$

$\displaystyle \frac{d}{d t} \sin t\ \cos t - \frac{d}{d t} \cos t\ \sin t = 1\ (4)$

... and the solutions of (4) are...

$\displaystyle \frac{d}{d t} \sin t = \cos t$

$\displaystyle \frac{d} {d t} \cos t = - \sin t\ (5)$

Kind regards

$\chi$ $\sigma$
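
Editorial note: it may help to spell out how the system (4) arises and is solved; only $|z(t)| = 1$ (the Pythagorean identity) is used, never the angle sum. Since $|z| = 1$, we have $1/z = \bar{z}$, so $z'/z = z'\ \bar{z}$, and taking real and imaginary parts of $z'\ \bar{z} = i$ gives (4). In matrix form, (4) reads

$\displaystyle \begin{pmatrix} \cos t & \sin t \\ -\sin t & \cos t \end{pmatrix} \begin{pmatrix} \frac{d}{dt} \cos t \\ \frac{d}{dt} \sin t \end{pmatrix} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$

and the coefficient matrix has determinant $\cos^2 t + \sin^2 t = 1$, so inverting it yields (5) at once.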
 
  • #18
Opalg said:
This becomes a circular argument (if you'll forgive the pun), and you end up trying to use the inverse sine function in order to define the sine function. That is why a rigorous treatment of trig functions has to start from the power series definition.

Yes, I know of this tautology. In fact, geometry is filled with tautologies -- you are given a point. To define a line, you draw another point, but where does the second one come from? Why, you have an infinite number of points, so now I define them. Then where do you put them? Why, in a space -- and here comes the concept of a space in which all the points are kept, i.e. \(\displaystyle \mathbb{R}^3\), and you have lost everything you were trying to keep in order. That's why modern mathematics uses better definitions than the geometric ones. I know of this, but the scope of this challenge is not going into that mess. :D
 
  • #19
@chisigma: How do you derive (3)?
 
  • #20
mathbalarka said:
@chisigma: How do you derive (3)?

For any complex function, $\displaystyle \ln z = \ln |z| + i\ \text{arg}\ (z)$, so that in our case...

$\displaystyle \ln z = \ln \sqrt{\cos^{2} t + \sin^{2} t} + i\ \tan^{-1} \frac{\sin t}{\cos t} = i\ t\ (1)$

... so that...

$\displaystyle \frac{d}{d t} \ln z = \frac{z^{\ '}(t)}{z(t)} = i\ (2)$

Kind regards

$\chi$ $\sigma$
 
  • #21
$(1)$ holds only when $-\pi/2 < t < \pi/2$, i.e. when $\cos(t) > 0$.
 
  • #22
mathbalarka said:
$(1)$ holds only when $-\pi/2 < t < \pi/2$, i.e. when $\cos(t) > 0$.

Of course,

$\displaystyle i\ \tan^{-1} \frac{\sin t}{\cos t} = i\ \tan^{- 1} \tan t = i\ (t + k\ \pi)\ (1)$

... but in any case the derivative of the constant term is $0$, so that we always have...

$\displaystyle \frac{d}{d t} \ln z = \frac{z^{\ '} (t)}{z(t)} = i\ (2)$

Kind regards

$\chi$ $\sigma$
 
  • #23
Well done! Your proof is correct; I like this one. In fact, mine is rather cumbersome compared to yours.

Thanks very much for participating,
Balarka
.
 
  • #24
Opalg said:
Okay, if you want to define trig functions in terms of triangles, then you must first define what you mean by an angle. And if you want to define angles (as radians), you have to know what is meant by arc length. But arc length, on a curve $y=f(x)$, is defined in terms of an integral \(\displaystyle \int\sqrt{1+f'(x)^2}dx\). If you try to do that for the unit circle, you will find yourself having to integrate $\dfrac1{\sqrt{1-x^2}}$, which leads you to $\arcsin x$. This becomes a circular argument (if you'll forgive the pun), and you end up trying to use the inverse sine function in order to define the sine function. That is why a rigorous treatment of trig functions has to start from the power series definition.

I almost agree with this.

It is actually possible to define the cosine function by first defining:

$\displaystyle A: [-1,1] \to \Bbb R,\ A(x) = \frac{x\sqrt{1-x^2}}{2} + \int_x^1 \sqrt{1 - t^2}\ dt$

and then defining cosine on the interval:

$\displaystyle [0, 2\int_{-1}^1 \sqrt{1 - u^2}\ du]$

as the unique function of $x$ such that:

$(A \circ \cos)(x) = \dfrac{x}{2}$

and then finally, defining:

$\displaystyle \pi = 2\int_{-1}^1 \sqrt{1 - u^2}\ du$

and on $[0,\pi]$ defining:

$\sin(x) = \sqrt{1 - (\cos x)^2}$.

At this point we extend the domain of these two functions by defining, for $x \in (\pi,2\pi]$:

$\cos(x) = \cos(2\pi - x)$
$\sin(x) = -\sin(2\pi - x)$,

and finally extend by periodicity.

Now, granted, this construction bears little resemblance to the usual geometric definitions of the trigonometric functions, and why we would do such a thing is a bit unmotivated, but it IS rigorous, and it DOESN'T use power series.

One advantage of this method is that the angle-sum identities are no longer required to derive the derivatives of these two functions; it follows (more or less straightforwardly) that:

$\cos'(x) = -\sin x$
$\sin'(x) = \cos x$

by applying the inverse function theorem to $B = 2A$ (first on the interval $[0,\pi]$ and then using the "extended" definitions to compute the derivatives on all of $\Bbb R$...care has to be taken to apply the correct one-sided limits at the "stitch points", of course).

A clear disadvantage of this is that it is NOT clear how to extend these definitions analytically to $\Bbb C$, but now the Taylor series in Zaid's post can be computed without fear of reference to the angle-sum formulae, and THOSE series can easily be shown to be convergent on $\Bbb C$.

Just sayin'
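
Editorial note: a small numerical sketch of this construction, just to watch it compute (Python; the midpoint quadrature, bisection, and tolerances are our own illustrative choices, not part of Deveno's argument):

```python
# A(x) = x*sqrt(1-x^2)/2 + integral_x^1 sqrt(1-t^2) dt is the area of the
# circular sector with "cosine" x; cos(theta) solves A(c) = theta/2.
import math

def A(x, steps=20_000):
    """Sector-area function on [-1, 1]; the integral uses the midpoint rule."""
    h = (1.0 - x) / steps
    integral = h * sum(math.sqrt(max(0.0, 1.0 - (x + (k + 0.5) * h) ** 2))
                       for k in range(steps))
    return x * math.sqrt(1.0 - x * x) / 2.0 + integral

def cos_from_area(theta, iters=40):
    """Bisect for the unique c in [-1, 1] with A(c) = theta/2 (A is decreasing)."""
    lo, hi = -1.0, 1.0
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if A(mid) > theta / 2.0:  # too much area: mid lies left of c, move right
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

for theta in (0.5, 1.0, 2.0, 3.0):
    print(theta, cos_from_area(theta), math.cos(theta))  # should agree closely
```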
 
  • #25
Deveno said:
I almost agree with this.

It is actually possible to define the cosine function by first defining:

$\displaystyle A: [-1,1] \to \Bbb R,\ A(x) = \frac{x\sqrt{1-x^2}}{2} + \int_x^1 \sqrt{1 - t^2}\ dt$

and then defining cosine on the interval:

$\displaystyle [0, 2\int_{-1}^1 \sqrt{1 - u^2}\ du]$

as the unique function of $x$ such that:

$(A \circ \cos)(x) = \dfrac{x}{2}$

and then finally, defining:

$\displaystyle \pi = 2\int_{-1}^1 \sqrt{1 - u^2}\ du$

and on $[0,\pi]$ defining:

$\sin(x) = \sqrt{1 - (\cos x)^2}$.

At this point we extend the domain of these two functions by defining, for $x \in (\pi,2\pi]$:

$\cos(x) = \cos(2\pi - x)$
$\sin(x) = -\sin(2\pi - x)$,

and finally extend by periodicity.

Now, granted, this construction bears little resemblance to the usual geometric definitions of the trigonometric functions, and why we would do such a thing is a bit unmotivated, but it IS rigorous, and it DOESN'T use power series.

One advantage of this method is that the angle-sum identities are no longer required to derive the derivatives of these two functions; it follows (more or less straightforwardly) that:

$\cos'(x) = -\sin x$
$\sin'(x) = \cos x$

by applying the inverse function theorem to $B = 2A$ (first on the interval $[0,\pi]$ and then using the "extended" definitions to compute the derivatives on all of $\Bbb R$...care has to be taken to apply the correct one-sided limits at the "stitch points", of course).

A clear disadvantage of this is that it is NOT clear how to extend these definitions analytically to $\Bbb C$, but now the Taylor series in Zaid's post can be computed without fear of reference to the angle-sum formulae, and THOSE series can easily be shown to be convergent on $\Bbb C$.

Just sayin'
Very ingenious! But I can't help thinking that power series are neater and more useful.

The only awkward thing about the power series approach is that (as in your method) the definition of $\pi$ is a bit messy. You have to show that $\cos x$ (as defined by its power series) is a decreasing function on the interval [0,2], with $\cos 0 = 1>0$ and $\cos2<0$. It follows from the intermediate value theorem that there is a unique number $x_0$ between 0 and 2 such that $\cos x_0 = 0$. The number $\pi$ is then defined to be $2x_0.$
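
Editorial note: here is that recipe as a short sketch (Python; the term count and iteration counts are illustrative): define cosine by its partial series, bisect for the first zero in $[0,2]$, and double it.

```python
# Recover pi as twice the first positive zero of the power-series cosine.
import math

def cos_series(x, terms=30):
    """Partial sum of sum_{n>=0} (-1)^n x^(2n) / (2n)!."""
    s, term = 0.0, 1.0
    for n in range(terms):
        s += term
        term *= -x * x / ((2 * n + 1) * (2 * n + 2))  # next term from previous
    return s

lo, hi = 0.0, 2.0         # cos_series(0) = 1 > 0 and cos_series(2) < 0
for _ in range(60):
    mid = (lo + hi) / 2.0
    if cos_series(mid) > 0.0:
        lo = mid
    else:
        hi = mid
print(2.0 * lo, math.pi)  # both ~ 3.141592653589793
```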
 
  • #26
How about working with the differential equation itself? Sine and cosine both satisfy the linear ODE

$$y'' = -y$$

with initial values at $z = 0$ (for the function and its derivative) of $(0, 1)$ and $(1, 0)$ respectively. This is a workable definition; moreover, the Taylor series can plausibly be generated from these two facts at $z = 0$ without the use of the angle-sum or any other trigonometric identity.

Anyhow, I like to think that the trigonometric functions are a result of their geometric definitions (and they really are). If one twiddles with the definition, one can do anything with them. Why not define sine and cosine by

$$ \cos(x) + i\sin(x) = e^{ix} $$
$$ \cos(x) - i\sin(x) = e^{-ix} $$

A much more workable, strong, and ingenious definition. But then my problem follows trivially. (Dull.)

At this point, the problem can be restated much more cleverly by defining them implicitly, somewhat as one does in Galois fields. Define them thus:

$$\cos(x)^2 + \sin(x)^2 = 1, x \in \mathbb{R}$$

But that is not unique at all. One can, for example, take the formulas for generating Pythagorean triples and turn the pair of periodic functions into non-periodic rational functions! Try again; define them as:

$$\sin(x + y) = \cos(x)\sin(y) + \sin(x)\cos(y)$$
$$\cos(x + y) = \cos(x)\cos(y) - \sin(x)\sin(y)$$
$$0 < x \cos(x) < \sin(x) < x, \; 0 < x < 1$$

Voila! You have defined sin and cos uniquely, and now you have to prove Euler's identity without the angle-sum formulas! Wait, what?! But these ARE the angle-sum formulas -- how can you define sin and cos without them?

Well, now you have it: prove that $x = x$ where $x$ doesn't exist. Everything is vacuously true at this point. :D

Thus, as a conclusion, my challenge problem is utter nonsense: sine and cosine aren't uniquely defined without the angle-sum formulas if you want to define them geometrically. And if they aren't defined geometrically, my problem is trivial. Hey, I am closing in on Gaussian philosophy! (See my .sig.)
 
  • #27
There's absolutely nothing wrong with using power series; it's an elegant approach, and quite satisfying.

But as I understand it, this entire thread is about avoiding proving something which uses that something in the course of the proof (hidden perhaps several layers deep).

We've probably all seen a more transparent example, where the well-ordering theorem is used to justify (perhaps informally) induction on the natural numbers, and then later the well-ordering theorem is proved by induction!

So I wanted to point out that using power series is not the ONLY way to avoid this trap (it IS, however, a very NICE way). That's why I "almost" agreed.

Mathematics is an interesting creature: Often the things we take as "facts" are built upon things which were never proven in the first place...high school students often become quite proficient in the use of real numbers, for example, without ever realizing that basic things they take as true, like:

$a^2 > 0$ for every real number $a \neq 0$

arise from properties of the real numbers due to their extension from the rationals, which in turn get THEIR ordering from the integers:

$\displaystyle \frac{a}{b} > \frac{c}{d} \iff ad > bc \quad (b, d > 0)$

and the integers, in turn get their ordering from the well-ordering of the natural numbers.

The point being: if a high-school student were to ask, "Why are the natural numbers well-ordered?", they would likely receive the response: "They just are". I was 19 before I ever saw an axiomatic development of the natural numbers, and I was quite a bit older before I understood that the logical consistency of such an axiomatic system was essentially undecidable. We use it because it works (the essence of set theory in as few words as possible).

So, it's common practice to take a fair amount of material "on faith", which is a bit strange, considering we spend a great deal of time and energy trying to do just the opposite: the entire purpose of being "rigorous" is to put our statements on firm footing.

For these reasons (among others), I am not entirely convinced that the real numbers (for example) DO EXIST, although I am perfectly happy to say that IF THEY DO, they form a complete ordered field.

A similar point could be made about power series: they are an abstraction, and we can't point to something in the real world and say: THAT is a power series. But they are USEFUL abstractions, and can offer insight into things (such as wave motion) that DO exist.

Euler's theorem is really a very beautiful theorem, right up there with a (similar in some ways) theorem of Pythagoras. It is a GOOD thing we can prove it, and the difficulties involved in doing so without self-reference is (in my humble opinion) something of a testament to how deep it cuts. The numbers $e$ and $\pi$ turn out to be brothers, after all...there is something very satisfying about this.

And I daresay, the fact that we can exploit Euler's theorem algebraically AND geometrically, enriches the CONTEXT we study it in. Not only can we formally manipulate strings for ease of calculation, we can draw really nifty pictures which keeps both sides of our brains happy (Interestingly enough, historically, resistance to complex numbers was rather high until the advent of the Argand diagram, and then everybody was like: oh...OH!).

**********

All that said, the only practical advantage to using my definition is that it enables the real trigonometric functions to be tackled as soon as one has a basic understanding of one-variable calculus, and one usually undertakes calculus BEFORE studying complex analysis (which really is a better forum for discussing power series because they converge in radii, which makes more sense than "intervals"). Of course, it IS possible to introduce power series relatively early in a calculus course, and this is somewhat "natural" if one uses Cauchy sequences of rationals to DEFINE real numbers. But the deeper connection between analytic functions and power series will have to be deferred until a later date in such an approach.

While I am impressed with chisigma's approach, I can't help but feel there is a "devil in the woodpile" lurking in his invocation of the arg function: it has, after all, discontinuities (and therefore so does any complex log function), whereas the exponential function (and the trig functions) do not possess this particular defect (and certainly we would want to apply Euler's theorem on an entire circle).

A third approach I have seen develops the (complex) exponential function and trig functions as certain complex differential equation solutions, namely:

sine is the solution to:

$f + f'' = 0$
$f(0) = 0$
$f'(0) = 1$

cosine is the solution to:

$f + f" = 0$
$f(0) = 1$
$f'(0) = 0$

and the complex exponential is the solution to:

$f - f' = 0$
$f(0) = 1$

It can be shown that these equations characterize unique complex functions, and we can then use the approach outlined in post #2.

Of course, this puts studying differential equations (and thus possibly linear algebra) as a prerequisite for our proof.

Of course, the "hard part" of the "easier" proofs of Euler's theorem amounts to showing that these "new" definitions of the trig functions AGREE with the values of the functions as we originally came to learn them, from geometry. This CAN be done, but often requires a bit of ingenuity in its own right.
 
  • #28
Deveno said:
Euler's theorem is really a very beautiful theorem, right up there with a (similar in some ways) theorem of Pythagoras. It is a GOOD thing we can prove it, and the difficulties involved in doing so without self-reference is (in my humble opinion) something of a testament to how deep it cuts. The numbers $e$ and $\pi$ turn out to be brothers, after all...there is something very satisfying about this.

You go very deep into the axiomatized system of transcendental numbers by saying so; I don't know whether you realize this, as it is not your field. Yes, in fact, $\pi$ and $e$ are brothers indeed, something which surprises even those who find mathematics completely useless and boring.

But furthermore, $i$ is their brother too, as one can see. These three brothers (or sisters, but I like to think the former) are glued together by an ingenious relation -- yes, quite ingenious indeed. But the relation is not just ingenious; it is also a key to some deep theory of transcendental analysis, and to the whole of algebraic and analytic number theory.

The relation $e^{i \pi} = -1$ has some beauties of its own, along with the fact that it is the mastermind of the game. It's unique in exponential rings, yes, since writing it another way, by taking logs of both sides, won't change the identity.

But nevertheless, it is also unique in another sense. If Schanuel's conjecture is true, then the only relation between $e$, $\pi$ and $i$ in exponential rings, along with some other conditions, is Euler's identity. Funny how the ordinary becomes extraordinary, no? I, at least, would be very pleased to see this proven, as it would establish that this particular work of Euler is truly beautiful in a sense.

EDIT: Okay, this seems to be pulling everyone here away from the original challenge. Apologies if so. :p
 
  • #29
mathbalarka said:
... okay, this seems to be pulling everyone here away from the original challenge. Apologies if so... :p

In fact Euler's identity is one of the 'golden keys' of mathematics, and I'm not surprised that your post is the starting point of very interesting discussions. In this topic two different 'right definitions' of the function $\sin x$ have been presented, and both of these definitions disregard the 'geometrical meaning' of the function. One definition is...

$\displaystyle \sin x = \sum_{n=0}^{\infty} (-1)^{n}\ \frac{x^{2 n + 1}}{(2 n + 1)!}\ (1)$

Another definition represents the function $\sin x$ as the solution of the ODE...

$\displaystyle y^{\ ''} = - y,\ y(0)=0,\ y^{\ '} (0) = 1\ (2)$

Very well!... but from my point of view it is difficult to imagine a comfortable way, starting from definition (1) or (2), to demonstrate the basic properties of the function... for example that for any real $x$, $\sin x = \sin (x + 2\ \pi)$... I do hope for some help from somebody...

Kind regards

$\chi$ $\sigma$
 
  • #30
chisigma said:
Very well!... but from my point of view it is difficult to imagine a comfortable way, starting from definition (1) or (2), to demonstrate the basic properties of the function

Exactly. They do bypass some tautologies and ambiguities, but none of them is in fact quite "right" to me. This is exactly the way one feels about analytic continuations of different functions -- some of them are unique but not very, well, correct. That is, at least, my point of view, which grew greatly from the time I studied tetration.
 
  • #31
chisigma said:
In fact Euler's identity is one of the 'golden keys' of mathematics, and I'm not surprised that your post is the starting point of very interesting discussions. In this topic two different 'right definitions' of the function $\sin x$ have been presented, and both of these definitions disregard the 'geometrical meaning' of the function. One definition is...

$\displaystyle \sin x = \sum_{n=0}^{\infty} (-1)^{n}\ \frac{x^{2 n + 1}}{(2 n + 1)!}\ (1)$

Another definition represents the function $\sin x$ as the solution of the ODE...

$\displaystyle y^{\ ''} = - y,\ y(0)=0,\ y^{\ '} (0) = 1\ (2)$

Very well!... but from my point of view it is difficult to imagine a comfortable way, starting from definition (1) or (2), to demonstrate the basic properties of the function... for example that for any real $x$, $\sin x = \sin (x + 2\ \pi)$... I do hope for some help from somebody...

Kind regards

$\chi$ $\sigma$

Suppose we define $\sin$ as the unique non-zero (twice-differentiable, of course) function $f$ such that:

$f + f'' = 0$ (1)
$f(0) = 0$
$f'(0) = 1$.

It becomes obvious that:

$f'$ also satisfies (1) (by differentiating both sides). We shall henceforth call $f' = \cos$.

Now suppose a function $f$ satisfies:

$f + f'' = 0$
$f(0) = 0$
$f'(0) = 0$.

I claim that then $f = 0$. Proof:

Multiplying (1) by $f'$, we get:

$(f')(f + f'') = 0$
$2(ff' + f'f'') = 0$
$((f')^2 + f^2)' = 0$

Thus $(f')^2 + f^2$ is a constant, and $f(0) = 0,f'(0) = 0$ imply this constant is 0. This in turn means:

$[f'(x)]^2 + [f(x)]^2 = 0$ for all $x$, which means $f(x) = 0$ for all $x$.

Now, suppose:

$f + f'' = 0$
$f(0) = a$
$f'(0) = b$

I claim $f = a\cos + b\sin$

Proof: Let $g = f - a\cos - b\sin$

Then $g' = f' - a(\cos)' - b(\sin)' = f' - a((\sin)')' - b\cos$

$= f' - a(\sin)'' - b\cos = f' + a\sin - b\cos$ (by (1)).

Now $g'' = f'' + a(\sin)' - b(\cos)' = f'' + a\cos + b\sin$, so:

$g + g'' = f + f'' = 0$,
$g(0) = f(0) - a\cos(0) - b\sin(0) = a - a + 0 = 0$
$g'(0) = f'(0) + a\sin(0) - b\cos(0) = b + 0 - b = 0$

Hence $g = 0$, by our earlier result.

Now we are in a position to PROVE the angle-sum formula, establishing that these "really are" our usual trig functions:

Let $y$ be any real number. For each such number, we can define a function:

$f(x) = \sin(x + y)$ and:

$f + f'' = 0$
$f(0) = \sin y$
$f'(0) = \cos y$

So we have $f = \sin y\cos + \cos y\sin$, that is:

$\sin(x + y) = \sin y \cos x + \cos y \sin x$

which holds for all $x$ and any real number $y$ (thus all of them).

The angle-sum identity for cosine can now be found in a similar fashion.

As Opalg mentioned earlier, producing $\pi$ requires the most work: the easiest way is probably to show that $\cos$ cannot be $> 0$ for all $x > 0$, so there is a smallest positive zero of $\cos$ (the properties of the first and second derivatives of cosine show it is concave downward at $0$).

Clever use of the angle-sum formulae can then be used to establish the periodicity.
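
Editorial note: both the conserved quantity $f^2 + (f')^2$ and the derived angle-sum identity are easy to watch numerically. A sketch (Python, classical RK4 on the first-order system; the step counts are illustrative):

```python
# Solve f'' = -f as the system (f, fp)' = (fp, -f) with RK4, then check the
# invariant f^2 + fp^2 and the angle-sum identity sin(x+y) = sin y cos x + cos y sin x.
def ode_sin_cos(t, steps=10_000):
    """Return (f, f') at time t for f'' = -f, f(0) = 0, f'(0) = 1, i.e. (sin t, cos t)."""
    f, fp = 0.0, 1.0
    h = t / steps
    for _ in range(steps):
        k1 = (fp, -f)
        k2 = (fp + h/2 * k1[1], -(f + h/2 * k1[0]))
        k3 = (fp + h/2 * k2[1], -(f + h/2 * k2[0]))
        k4 = (fp + h * k3[1], -(f + h * k3[0]))
        f  += h/6 * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0])
        fp += h/6 * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1])
    return f, fp

s1, c1 = ode_sin_cos(1.0)
s2, c2 = ode_sin_cos(2.0)
s3, c3 = ode_sin_cos(3.0)
print("invariant f^2 + f'^2:", s1*s1 + c1*c1)   # stays ~ 1
print("sin(1+2) =", s3, "vs", s2*c1 + c2*s1)    # angle-sum check
```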
 
  • #32
I didn't write that there is no way, using a 'non-geometrical' definition of $\sin x$, to demonstrate that $\sin x = \sin (x + 2\ \pi)$... I wrote that in my opinion there isn't a comfortable way to do it...

Kind regards

$\chi$ $\sigma$
 
  • #33
I agree that sometimes the "rigorous" way isn't the "intuitive" way. We have to, in some of these discussions, take the "long way 'round" to prove something immediately evident from a diagram.

One could view this state of affairs as a defect of analysis, or: as a tribute to the subtlety of geometry. :)
 
  • #34
Okay, this is something new. I seem to have found a very elegant proof of the challenge problem; please review my proof for errors:

Using the limit definition of the transcendental constant $e$, we can derive

$$e^{iz} = \lim_{n \to \infty} \left ( 1 + i \frac{z}{n} \right )^n$$

The convergence of the limit follows from the binomial theorem, which uses essentially only the factorial function. Our goal is to prove that $e^{iz}$ lies on the unit circle in $\mathbb{C}$ for real $z$. This is proved by computing

$$ \left | \left ( 1 + i \frac{z}{n} \right )^n \right | = \left ( \left | 1 + i \frac{z}{n} \right | \right )^n = \left ( 1 + \frac{z^2}{n^2} \right )^{n/2} $$

and showing that this tends to $1$ as $n$ tends to $\infty$, which is quite straightforward from the definition of the limit. Hence we have $\left |e^{iz}\right | = 1$, as desired.

Now, since every complex number on the unit circle is of the form $\cos(\theta) + i\sin(\theta)$ -- which follows from the geometric definition of $\sin$ and $\cos$ -- the problem is proved.
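
Editorial note: the (corrected) modulus computation is easy to watch converge, and the full limit visibly lands on $\cos z + i\sin z$ (Python sketch; the values of $z$ and $n$ are illustrative):

```python
# For real z, |1 + iz/n|^n = (1 + z^2/n^2)^(n/2) -> 1, while (1 + iz/n)^n
# itself approaches cos z + i sin z.
import math

z = 2.0
for n in (10, 1_000, 100_000):
    w = (1 + 1j * z / n) ** n
    print(n, abs(w), w)
print(complex(math.cos(z), math.sin(z)))  # target: cos 2 + i sin 2
```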
 
  • #35
mathbalarka said:
Okay, this is something new. I seem to have found a very elegant proof of the challenge problem; please review my proof for errors:

Using the limit definition of the transcendental constant $e$, we can derive

$$e^{iz} = \lim_{n \to \infty} \left ( 1 + i \frac{z}{n} \right )^n$$

The convergence of the limit follows from the binomial theorem, which uses essentially only the factorial function. Our goal is to prove that $e^{iz}$ lies on the unit circle in $\mathbb{C}$ for real $z$. This is proved by computing

$$ \left | \left ( 1 + i \frac{z}{n} \right )^n \right | = \left ( \left | 1 + i \frac{z}{n} \right | \right )^n = \left ( 1 + \frac{z^2}{n^2} \right )^{n/2} $$

and showing that this tends to $1$ as $n$ tends to $\infty$, which is quite straightforward from the definition of the limit. Hence we have $\left |e^{iz}\right | = 1$, as desired.

Now, since every complex number on the unit circle is of the form $\cos(\theta) + i\sin(\theta)$ -- which follows from the geometric definition of $\sin$ and $\cos$ -- the problem is proved.

Proving that $|e^{i\ z}| = 1$ for all $z$ does not prove that $e^{i\ z} = \cos z + i\ \sin z$... for example, it could be that $e^{i\ z} = 1$...

Kind regards

$\chi$ $\sigma$
 
