Suppose we set out to prove the irrationality of the natural logarithm of π. Writing out the Taylor series of y = π^x centered at zero and evaluating at x = 1, we have:

π = 1 + Sum((ln π)^k / k!) from k = 1 to infinity.

Since we know π is irrational, ln(π) must be irrational as well; otherwise we could write π = (a + b)/b for some integers a and b. Why is this argument not correct?
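As a sanity check, the expansion itself is easy to verify numerically: it is just the Taylor series of e^x evaluated at x = ln(π). A minimal sketch in Python (the cutoff of 30 terms is an arbitrary choice; the terms decay factorially, so this is far more than enough for double precision):

```python
import math

# Taylor series of pi^x = e^(x * ln pi) at x = 1:
# pi = sum_{k=0}^inf (ln pi)^k / k!
ln_pi = math.log(math.pi)
partial_sum = sum(ln_pi**k / math.factorial(k) for k in range(30))

# The truncated sum should agree with pi to machine precision.
print(abs(partial_sum - math.pi) < 1e-12)
```

This confirms the identity holds, so any flaw in the argument must lie in the step from the series to the conclusion about ln(π), not in the expansion itself.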