I understand the reason why, if the evaluated integral converges, then the series also converges. However, just because the evaluated integral diverges, why does this automatically mean that the related series also diverges?
The integral takes in every value of f(x) along the positive x-axis when determining whether f converges or diverges. So if the "sum" of f(x) over every single point/interval on the positive x-axis converges, then it seems obvious that the series converges as well, because the series sums up far fewer values of f(x), even though the integral and the series follow the same overall trend, i.e. both converge.
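For concreteness, this is the comparison I have in mind for the convergence direction; just a sketch, assuming f is positive and decreasing on [1, ∞) as the integral test requires:

$$\sum_{n=2}^{\infty} f(n) \;\le\; \int_{1}^{\infty} f(x)\,dx,$$

since for a decreasing f each term f(n) is at most the area under the curve over [n-1, n].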
However, for the integral test, when there is a function f(x) whose evaluated integral turns out to diverge, why does this automatically mean that the series (whose sum consists of far fewer values of f(x)) also diverges?
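In symbols, I think the claim I am struggling with is the reverse bound (again assuming f is positive and decreasing on [1, ∞)):

$$\sum_{n=1}^{N} f(n) \;\ge\; \int_{1}^{N+1} f(x)\,dx,$$

since each term f(n) is at least the area under the curve over [n, n+1], so the partial sums sit above the integral rather than below it.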