I have come across the following argument, which seems wrong to me, in a larger proof (Theorem 4 on page 9 of the document available at http://www.whitman.edu/mathematics/SeniorProjectArchive/2011/SeniorProject_JonathanWells.pdf). I would appreciate it if someone could shed light on why it is true.
The argument is that, given a sequence $a_k$ of points in $[a,b]$, there exists a subinterval of $[a,b]$ whose length is smaller than some value $g < b-a$ and which contains infinitely many terms of the sequence $a_k$.
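For reference, here is my reconstruction of how a claim like this is usually justified (this is my guess at the intended argument, not a quote from the document). Choose $n$ large enough that $(b-a)/n < g$, and partition $[a,b]$ into the $n$ subintervals
$$I_j = \left[\, a + \tfrac{(j-1)(b-a)}{n},\; a + \tfrac{j(b-a)}{n} \,\right], \qquad j = 1, \dots, n,$$
each of length $(b-a)/n < g$. Since infinitely many terms $a_k$ are distributed among finitely many subintervals $I_j$, the pigeonhole principle gives at least one $I_j$ that contains $a_k$ for infinitely many indices $k$.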
I disagree with the above statement, because suppose the sequence $a_k$ is constant, say $a_k = b$ for all $k$. Then it seems to me that the statement does not hold.