I am trying to solve the following problem:
Compute \lim_{n \to \infty}\left(\frac{a_n+b_n}{2}\right)^n, given that \lim_{n \to \infty} a_n^n=a>0 and \lim_{n \to \infty} b_n^n=b>0, where a_n,b_n>0 \ \forall \ n \in \mathbb{N}.
I tried to use the Sandwich Theorem to come up with an answer, but my upper bound was not tight. By AM–GM,
\max(a_n,b_n)\ge\frac{a_n+b_n}{2} \ge \sqrt{a_nb_n}.
Raising to the n-th power and passing to the limit, I got the following:
\max(a,b)\ge \lim_{n \to \infty}\left(\frac{a_n+b_n}{2}\right)^n \ge \sqrt{ab}
But this doesn't help me at all. How could I actually compute the limit?
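Before proving anything, it can help to guess the answer numerically. Below is a quick sketch using the illustrative choice a_n = a^{1/n} and b_n = b^{1/n} (not from the question; these are just convenient sequences satisfying a_n^n = a and b_n^n = b exactly):

```python
import math

# Illustrative sequences (an assumption, chosen for convenience):
# a_n = a**(1/n) and b_n = b**(1/n), so a_n**n == a and b_n**n == b.
a, b = 4.0, 9.0

for n in [10, 100, 10000]:
    a_n = a ** (1.0 / n)
    b_n = b ** (1.0 / n)
    mid = ((a_n + b_n) / 2) ** n  # the quantity whose limit we want
    print(n, mid)

# The values settle near sqrt(a*b) = 6, strictly between the sandwich
# bounds sqrt(a*b) = 6 and max(a, b) = 9.
```

This suggests the limit is \sqrt{ab}, i.e. the lower bound of the sandwich, which the argument below confirms.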
Answer
This is done in steps. First, note that both a_n and b_n tend to 1: since n\log a_n\to \log a, we have \log a_n\to 0, and hence a_n\to 1 (similarly for b_n).
Next, let x_n denote the expression whose limit is to be evaluated. Then
\log x_n=n\log \left(1+\frac{a_n+b_n-2}{2}\right).
Since a_n,b_n\to 1, the quantity t_n=\frac{a_n+b_n-2}{2} tends to 0, so \log(1+t_n)\sim t_n and the limit of the above expression is the same as that of
\frac{1}{2}\{n(a_n-1)+n(b_n-1)\}.
Next we use the fact that n\log a_n\to\log a, which implies n(a_n-1)\to\log a (and likewise n(b_n-1)\to\log b). The limit of \log x_n is thus
\frac{\log a +\log b}{2},
and it follows that x_n\to\sqrt{ab}.
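The key step n(a_n-1)\to\log a can be checked numerically. A minimal sketch, again assuming the illustrative sequence a_n = a^{1/n} (so that a_n^n = a exactly):

```python
import math

# Check that n*(a_n - 1) -> log(a) for the illustrative choice
# a_n = a**(1/n); here a = 5, so the target is log(5) ~ 1.6094.
a = 5.0
for n in [10, 1000, 100000]:
    a_n = a ** (1.0 / n)
    print(n, n * (a_n - 1))
```

The printed values converge to \log 5, consistent with the claim that n\log a_n\to\log a forces n(a_n-1)\to\log a.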
The above argument makes use of the standard limit \lim\limits_{x\to 1}\dfrac{\log x} {x-1}=1.
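That standard limit is also easy to verify numerically; a small sketch:

```python
import math

# Check the standard limit log(x)/(x - 1) -> 1 as x -> 1.
for h in [1e-1, 1e-3, 1e-6]:
    x = 1 + h
    print(x, math.log(x) / (x - 1))
```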