I'm a PreCalculus student trying to find a rigorous proof that $\displaystyle\frac{1}{3} = 0.333\ldots$, but I couldn't find one. I think (just think) that such a proof would start by proving that
$\displaystyle\sum_{i=1}^{\infty}3\cdot10^{-i} = \frac{1}{3}$. My attempt (assuming that proving that $\displaystyle\sum_{i=1}^{\infty}\left(\frac{1}{10}\right)^i$ converges is trivial):
$\displaystyle\sum_{i=1}^{\infty}3\cdot10^{-i} = 3\cdot\sum_{i = 1}^{\infty}10^{-i} = 3\cdot\sum_{i=1}^{\infty}\left(\frac{1}{10}\right)^i = 3\cdot\left(\frac{1}{1 - \frac{1}{10}}-1\right) = 3\cdot\left(\frac{10}{9}-1\right) = \frac{1}{3}$.
Questions: Is this completely rigorous? What flaws could be found in this proof? How can I improve it?
PS. I'm not sure how to tag this. Feel free to edit, if necessary.
Answer
Since $\sum_{i=1}^\infty 3\cdot 10^{-i}$ is what the notation "$0.333\ldots$" means, your argument is perfectly good. It's not just the "start of a proof"; it is all there is to it.
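To make that explicit, here is the convention in symbols (just a restatement of what the notation means, nothing beyond what is said above):
$\displaystyle 0.333\ldots \;:=\; \sum_{i=1}^{\infty}\frac{3}{10^{i}} \;=\; \sum_{i=1}^{\infty}3\cdot 10^{-i}.$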
Okay, perhaps it is not completely trivial to prove that the geometric series converges, but it is straightforward: just plug in the definition of the sum of a series and crank the handle, using the standard trick to put each partial sum in closed form, as sketched below.
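For concreteness, here is one way that computation can go, using the usual closed form for a finite geometric sum. The $n$-th partial sum is
$\displaystyle S_n \;=\; \sum_{i=1}^{n} 3\cdot 10^{-i} \;=\; 3\cdot\frac{10^{-1}\left(1-10^{-n}\right)}{1-10^{-1}} \;=\; \frac{1}{3}\left(1-10^{-n}\right),$
and since $10^{-n}\to 0$ as $n\to\infty$, the partial sums converge to $\frac{1}{3}$, which is exactly the statement $0.333\ldots = \frac{1}{3}$.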