- #1
fakecop
Can someone please explain how the Taylor series works if x, the value at which the function is evaluated, is equal to a, the value around which you expand the function?
For example, take 1/(1-x). The Taylor series for this with a = 0 is Σ (n from 0 to infinity) x^n. But if we let x = a = 0, then the first term of this series is 0^0, which is indeterminate.
Am I supposed to take the limit of x^0 as x approaches 0 (which is 1 in this case), always take x^0 to equal 1, or follow some other rule?
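For reference, here is the general expansion I am asking about, written with the usual power-series convention that the n = 0 term (x-a)^0 is treated as 1, so that term reduces to f(a). This is just a sketch of the formula under that convention, not an argument for which convention is correct:

```latex
% General Taylor series of f about x = a. Under the standard
% power-series convention the n = 0 factor (x-a)^0 is taken to be 1,
% so the first term is simply f(a).
\[
  f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(x-a)^n
       = f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \cdots
\]
% For f(x) = 1/(1-x) and a = 0 this gives \sum_{n=0}^{\infty} x^n;
% with the convention above, the n = 0 term is 1 even at x = 0,
% so the series evaluates to f(0) = 1 there.
```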