(Fitzpatrick Advanced Calculus 2e, Sec. 2.4 #12)
For $c>0$, consider the quadratic equation
$$x^2-x-c=0,\qquad x>0.$$
Define the sequence $\{x_n\}$ recursively by fixing $|x_1|<c$ and then, if $n$ is an index for which $x_n$ has been defined, defining
$$x_{n+1}=\sqrt{c+x_n}\,.$$
Prove that the sequence $\{x_n\}$ converges monotonically to the solution of the above equation.
Note: The answers below might assume $x_1>0$, but they still work, as we have $x_3>0$.
This is being repurposed in an effort to cut down on duplicates; see Coping with abstract duplicate questions and List of abstract duplicates.
Answer
Assuming that you know that a monotone, bounded sequence converges, you want to do two things. First, show that $\langle x_n:n\in\Bbb Z^+\rangle$ is monotone and bounded, and then show that its limit is the positive root of $x^2-x-c=0$.
If $c=x_1=1$, $x_2=\sqrt2>x_1$, while if $c=1$ and $x_1=2$, $x_2=\sqrt3<x_1$, so the sequence may be either increasing or decreasing, depending on the starting point.
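For a concrete picture of the two behaviours, take $c=1$, so that the positive root is $\frac12\left(1+\sqrt5\right)\approx1.618$; the first few iterates in each case are
$$1,\ \sqrt2\approx1.414,\ \sqrt{1+\sqrt2}\approx1.554,\ \dots\qquad\text{and}\qquad 2,\ \sqrt3\approx1.732,\ \sqrt{1+\sqrt3}\approx1.653,\ \dots,$$
the first climbing up toward the root and the second sliding down toward it.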
The positive root of the quadratic is $\frac12\left(1+\sqrt{1+4c}\right)$, which I’ll denote by $r$. If $x_n\to r$, as claimed, and does so monotonically, it must be the case that the sequence increases monotonically if $x_1<r$ and decreases monotonically if $x_1>r$.
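(To see where $r$ comes from: the quadratic formula applied to $x^2-x-c=0$ gives
$$x=\frac{1\pm\sqrt{1+4c}}2,$$
and since $\sqrt{1+4c}>1$ when $c>0$, only the $+$ sign yields a positive root.)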
This suggests that your first step should be to show that if $0\le x_n<r$, then $x_n<x_{n+1}<r$, while if $x_n>r$, then $r<x_{n+1}<x_n$.
Suppose that $0\le x_n<r$; you want to show that $x_n<x_{n+1}<r$, and the key fact to use is that $r^2=c+r$.
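One way to carry this out:
$$x_{n+1}^2=c+x_n<c+r=r^2\implies x_{n+1}<r,\qquad x_{n+1}^2-x_n^2=c+x_n-x_n^2=-(x_n^2-x_n-c)>0\implies x_n<x_{n+1},$$
the last inequality holding because $x^2-x-c<0$ for $0\le x<r$. The case $x_n>r$ goes the same way with all of the inequalities reversed.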
Once this is done, you still have to show that the limit of the sequence really is $r$. Let $f(x)=\sqrt{c+x}$; clearly $f$ is continuous, so if the sequence converges to $L$, we have $L=\lim_{n\to\infty}x_n=\lim_{n\to\infty}x_{n+1}=\lim_{n\to\infty}f(x_n)=f(L)$, i.e., $L=\sqrt{c+L}$, and from here it’s a short step to see that $L=r$.
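Spelling out that last step:
$$L=\sqrt{c+L}\implies L^2-L-c=0\ \text{ and }\ L\ge0\implies L=\frac12\left(1+\sqrt{1+4c}\right)=r.$$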
Added: Note that although the problem gave us $x_1>0$, this isn’t actually necessary: all that’s needed is that $x_1\ge-c$, so that $x_2$ is defined, since $x_2=\sqrt{c+x_1}\ge0$ automatically.
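For instance, with $c=1$ and $x_1=-\frac12$ the iteration is still perfectly well behaved: it lands in $[0,r)$ at the second step and then increases toward $r=\frac12\left(1+\sqrt5\right)\approx1.618$,
$$x_1=-\tfrac12,\quad x_2=\sqrt{\tfrac12}\approx0.707,\quad x_3\approx1.307,\quad x_4\approx1.519,\quad x_5\approx1.587,\ \dots$$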