- #71
ramsey2879
I supposed that calculus back in high school in the 1960s may have treated infinity differently from how they treat it today. I believe that all we had to do was simply recognize that infinity was like a much, much bigger number than A to deduce that A/[infinity] was 0. PS: I don't understand Micromass's math notation, as I never had much math beyond high school.

Hurkyl said:
No, to get the limit of A/n as n goes to infinity, you need to know the Archimedean principle.
To compute the limit of A/n as n goes to infinity instead by comparing to A/[infinity], you need a lot more information. One set of information would be:
- A number system containing an element called [infinity] along with all the real numbers
- Knowledge that A/[infinity] = 0 if A is finite
- Knowledge that division in this new number system gives the same results as division in the real numbers, when both numbers are real
- Knowledge that division is continuous in this new number system (at least at (A, [infinity]))
- Knowledge that n converges to [infinity] in this new number system as n goes to infinity
- Knowledge that limits computed in this new number system agree with limits computed in the real numbers, when that makes sense
(For the record, my thought processes would probably compute the limit by invoking continuity of division in the projective real numbers before any other approach.)
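For concreteness, here is a minimal sketch of how the Archimedean-principle route might go, with A a fixed real number and n ranging over the positive integers:

\[
\text{Given } \varepsilon > 0,\ \text{the Archimedean property supplies an integer } N \text{ with } N > \frac{|A|}{\varepsilon}.
\]
\[
\text{Then for every } n \ge N:\quad \left|\frac{A}{n} - 0\right| = \frac{|A|}{n} \le \frac{|A|}{N} < \varepsilon,
\qquad\text{so}\qquad \lim_{n\to\infty} \frac{A}{n} = 0.
\]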
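And a minimal sketch of the second route, assuming a system such as the projective real numbers with the properties listed above:

\[
\lim_{n\to\infty}\frac{A}{n}
\;=\;
\frac{A}{\displaystyle\lim_{n\to\infty} n}
\;=\;
\frac{A}{\infty}
\;=\; 0,
\]

where the first equality uses continuity of division at (A, [infinity]), the second uses the fact that n converges to [infinity] in that system, and the third uses A/[infinity] = 0 for finite A; the remaining items in the list guarantee that this answer agrees with the ordinary real-number limit.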