Lengalicious
Homework Statement
determine the radius of convergence of the series expansion of log(a + x) around x = 0
Homework Equations
The Attempt at a Solution
After applying the Taylor series expansion about x = 0, I get log(a) + SUM[(-1)^(n+1) x^n/(n a^n)]. I understand how to get the radius of convergence for log(1 + x), for instance, using the ratio test, but with the 1 replaced by what I believe to be an arbitrary constant a, I'm now confused about how to tackle this. Any help would be great, thanks in advance!
Would I employ the ratio test in the same way? I end up with a log in the numerator and denominator, so I'm not really sure...
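One way forward, sketched below under the assumption a > 0: the leading log(a) term is a constant, so it plays no role in convergence; the ratio test is applied only to the general term of the sum, exactly as for log(1 + x). Writing b_n for the n-th term (this is my working, not a confirmed answer from the thread):

```latex
% Ratio test on the general term b_n = (-1)^{n+1} x^n / (n a^n):
\lim_{n\to\infty}\left|\frac{b_{n+1}}{b_n}\right|
  = \lim_{n\to\infty}\frac{|x|^{n+1}}{(n+1)\,|a|^{n+1}}
    \cdot \frac{n\,|a|^{n}}{|x|^{n}}
  = \frac{|x|}{|a|}\,\lim_{n\to\infty}\frac{n}{n+1}
  = \frac{|x|}{|a|}.
% The series converges when |x|/|a| < 1, i.e. |x| < |a|,
% so the radius of convergence is R = |a|.
```

Note there is no logarithm left inside the ratio: log(a) sits outside the sum entirely, and the n-th term contains only powers of x and a, so the limit is a plain |x|/a (for a > 0). Setting a = 1 recovers the familiar radius R = 1 for log(1 + x).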