Tuesday, March 27, 2018

Limit of recurrence sequence


I have to find a limit (or prove it doesn't exist) for the following recurrence sequence.


$a_1 = 2; a_{n+1} = \frac{1}{2}(a_n + \frac{2}{a_n})$
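For reference, a quick numerical sketch (plain Python, just to see the behaviour; not a proof):

    # iterate a_{n+1} = (a_n + 2/a_n) / 2 starting from a_1 = 2
    a = 2.0
    for n in range(1, 7):
        print(f"a_{n} = {a:.8f}")
        a = 0.5 * (a + 2.0 / a)

The printed values decrease toward $\sqrt{2} \approx 1.41421356$.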


Now I know that, in order to find the limit, I first need to prove that the sequence is monotonic and bounded. I've made a partial table of values and concluded that the sequence seems to be decreasing, so to prove monotonicity I've written down:


$ a_{n+1} < a_n \iff a_n > \sqrt{2} $
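Spelled out (assuming $a_n > 0$ for all $n$, which holds since $a_1 = 2 > 0$ and the recurrence maps positive numbers to positive numbers), this comes from

$ a_{n+1} < a_n \iff \frac{1}{2}\left(a_n + \frac{2}{a_n}\right) < a_n \iff \frac{2}{a_n} < a_n \iff a_n^2 > 2 \iff a_n > \sqrt{2}. $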



And that's all I could think of. I don't think the inequality above proves anything, so I don't know how to continue. I tried to calculate the limit of the sequence by taking limits on both sides of the recurrence, as follows:


$ \lim a_{n+1} = \frac{1}{2}\left(\lim a_n + \lim \frac{2}{a_n}\right) \Rightarrow a = \frac{1}{2}\left(a + \frac{2}{a}\right) \Rightarrow a = \sqrt{2} $


But without proving monotonicity and boundedness, there's no proof that the limit exists at all.


Thank you for any help in advance.


Answer



That is the Babylonian algorithm used to extract the square root of a real number, so the limit you found is the right one. Have you seen how to study sequences of the form $a_{n+1}=f\left(a_n\right)$? You can also look at $\displaystyle \frac{a_{n+1}}{a_n}$, since the sequence never vanishes.
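As a rough sketch of one way this can be carried out (one possible route, not the only one): by the AM-GM inequality,

$ a_{n+1} = \frac{1}{2}\left(a_n + \frac{2}{a_n}\right) \ge \sqrt{a_n \cdot \frac{2}{a_n}} = \sqrt{2}, $

so $a_n \ge \sqrt{2}$ for every $n$ (and $a_1 = 2 > \sqrt{2}$ as well). Then

$ \frac{a_{n+1}}{a_n} = \frac{1}{2}\left(1 + \frac{2}{a_n^2}\right) \le \frac{1}{2}(1 + 1) = 1, $

so the sequence is decreasing and bounded below, hence convergent, and the limit $a$ satisfies $a = \frac{1}{2}\left(a + \frac{2}{a}\right)$, i.e. $a = \sqrt{2}$.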

