- #1
skate_nerd
- 176
- 0
Sorry to spam my problems all over this forum, but series have me struggling somewhat. The last problem on my homework is the sequence \(a_n\) defined recursively by:
\(a_1 = 1\)
and
\(a_{n+1} = \frac{a_n}{2} + \frac{1}{a_n}\)  (1)
The first part was the only part I knew how to do: find \(a_n\) for \(n = 1\) through \(5\).
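(Side note: here is a quick Python sketch I wrote to sanity-check my hand computations for that first part; the formatting is just my own choice, not part of the assignment.)

```python
# Sanity check: compute a_1 through a_5 for the recursion
# a_{n+1} = a_n/2 + 1/a_n with a_1 = 1.
a = 1.0  # a_1
for n in range(1, 6):
    print(f"a_{n} = {a}")
    a = a / 2 + 1 / a  # apply the recursion to get a_{n+1}
```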
However, this next part has me stumped. Assume that
\(\lim_{n \to \infty} a_n = \alpha > 0.\)
Obtain the value of \(\alpha\) by taking the limit of both sides of (1) [(1) is the recursion given at the beginning].
Now I can already tell this limit is going to be \(\sqrt{2}\), because it is hinted at in the later parts of this problem. However, I am confused about how to take the limit of both sides of \(a_{n+1} = \frac{a_n}{2} + \frac{1}{a_n}\). How would I evaluate this limit when the right side of the equation is in terms of \(a_n\)?
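For what it's worth, a small numeric experiment (again my own sketch, not from the book) does back up the \(\sqrt{2}\) hunch: the gap between \(a_n\) and \(\sqrt{2}\) shrinks very quickly.

```python
import math

# Watch |a_n - sqrt(2)| shrink as the recursion is iterated.
a = 1.0  # a_1
for n in range(1, 8):
    print(f"a_{n} = {a:.10f}, |a_{n} - sqrt(2)| = {abs(a - math.sqrt(2)):.2e}")
    a = a / 2 + 1 / a  # a_{n+1} = a_n/2 + 1/a_n
```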