Hi everyone
When determining the radius of convergence of a power series, when should I use the ratio test ($\lim_{n\to\infty} |a_{n+1}/a_n|$) versus the root test ($\lim_{n\to\infty} |a_n|^{1/n}$)?
I know that I'm supposed to use the ratio test when there are factorials, but other than that, are these tests basically interchangeable?
Also, are there any differences in how the two tests are applied specifically in the context of determining the radius of convergence of a power series?
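For concreteness, here is a simple case I tried where both tests seem to give the same answer (my own example, so I may be misapplying something):

```latex
% Series: \sum_{n=1}^{\infty} \frac{x^n}{n}, so a_n = \frac{1}{n}.
%
% Ratio test:
\lim_{n\to\infty} \left|\frac{a_{n+1}}{a_n}\right|
  = \lim_{n\to\infty} \frac{n}{n+1} = 1
  \quad\Rightarrow\quad R = 1.
%
% Root test (using n^{1/n} \to 1):
\lim_{n\to\infty} |a_n|^{1/n}
  = \lim_{n\to\infty} \frac{1}{n^{1/n}} = 1
  \quad\Rightarrow\quad R = 1.
```

So at least here they agree. Is that always the case whenever both limits exist?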
Thanks