My question looks quite obvious, but I'm looking for a rigorous proof. (At least, I assume what I claim is true.)
Why can't the sum of two cube roots of positive non-perfect cubes be an integer?
For example: $\sqrt[3]{100}+\sqrt[3]{4}$ isn't an integer. I know this looks obvious, but I can't prove it...
For specific numbers it is easy to show, by finding lower and upper bounds for the roots (or by just taking a calculator and checking).
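The calculator approach can be made rigorous with integer arithmetic: compute rational bounds on each cube root and check that their sum is squeezed strictly between two consecutive integers. A minimal sketch (the helper `cbrt_bounds` is my own illustration, not from the question):

```python
from fractions import Fraction

def cbrt_bounds(n, digits=3):
    """Rational lower/upper bounds on the cube root of n,
    accurate to `digits` decimal places, using exact integer arithmetic."""
    scale = 10 ** digits
    # find the largest k with k^3 <= n * scale^3
    k = 0
    while (k + 1) ** 3 <= n * scale ** 3:
        k += 1
    return Fraction(k, scale), Fraction(k + 1, scale)

lo1, hi1 = cbrt_bounds(100)
lo2, hi2 = cbrt_bounds(4)
lo, hi = lo1 + lo2, hi1 + hi2
# the sum of the bounds lies strictly between 6 and 7,
# so cbrt(100) + cbrt(4) cannot be an integer
assert 6 < lo and hi < 7
print(float(lo), float(hi))
```

This certifies the single example from the question, but of course it says nothing about the general claim.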
My work so far:
Suppose $\sqrt[3]{m}+\sqrt[3]{n}=x$, where $x$ is an integer. Raising both sides to the third power gives $m+n+3\sqrt[3]{mn}\left(\sqrt[3]{m}+\sqrt[3]{n}\right)=x^3$, and substituting $\sqrt[3]{m}+\sqrt[3]{n}=x$ again yields $m+n+3x\sqrt[3]{mn}=x^3$. So $\sqrt[3]{mn}$ is rational, which implies $mn$ is a perfect cube (this is shown in a way similar to the well-known proof that $\sqrt{2}$ is irrational).
Now I don't know how to continue. One way is setting $n=a^3/m$, where $a=\sqrt[3]{mn}$, which gives $m^2+a^3+3amx=mx^3$, but I'm not sure whether this is helpful.
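The cubing step above can be spot-checked numerically (in floating point, so only approximately; the choice of test values is mine):

```python
import math

def check_cubing_step(m, n):
    """Check numerically that x = m^(1/3) + n^(1/3) satisfies
    m + n + 3*x*(m*n)^(1/3) = x^3, as derived by cubing both sides."""
    x = m ** (1 / 3) + n ** (1 / 3)
    lhs = m + n + 3 * x * (m * n) ** (1 / 3)
    return math.isclose(lhs, x ** 3, rel_tol=1e-9)

assert check_cubing_step(100, 4)
assert check_cubing_step(2, 3)
```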
Maybe the solution has to be found in a way similar to how one would do it with a calculator: finding some bounds and squeezing the sum of these roots between two well-chosen integers. But this is no more than a wild idea.
Answer
Suppose $a+b=c$, so that $a+b-c=0$, with $a^3$, $b^3$, $c$ all rational.
Then we have $-3abc=a^3+b^3-c^3$, by virtue of the identity $x^3+y^3+z^3-3xyz=(x+y+z)(x^2+y^2+z^2-xy-xz-yz)$ applied with $x=a$, $y=b$, $z=-c$ (the right-hand side vanishes because $a+b-c=0$).
Hence $abc$ is rational, so (for $c\neq 0$) $ab$ is rational; since $a+b=c$ is also rational, $a$ and $b$ satisfy a quadratic equation with rational coefficients, namely $t^2-ct+ab=0$.
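The identity being invoked can be spot-checked exactly with integer arithmetic (the test values are arbitrary):

```python
import random

# Spot-check the identity
#   x^3 + y^3 + z^3 - 3xyz = (x + y + z)(x^2 + y^2 + z^2 - xy - xz - yz)
# on random integer triples, using exact arithmetic.
for _ in range(100):
    x, y, z = (random.randint(-20, 20) for _ in range(3))
    lhs = x**3 + y**3 + z**3 - 3 * x * y * z
    rhs = (x + y + z) * (x * x + y * y + z * z - x * y - x * z - y * z)
    assert lhs == rhs
```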
There are lots of ways of completing the proof from here.
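One such completion, sketched for the original question (where $a=\sqrt[3]{m}$ with $m$ a positive integer that is not a perfect cube), uses standard facts about minimal polynomials:

```latex
% Sketch: a satisfies a rational quadratic, but its minimal
% polynomial over Q has degree 3 -- contradiction.
\begin{align*}
&a \text{ is a root of } t^2 - ct + ab = 0 \text{ with rational
 coefficients, so } [\mathbb{Q}(a):\mathbb{Q}] \le 2.\\
&\text{But } a^3 = m \text{, and } t^3 - m \text{ has no rational root
 since } m \text{ is not a perfect cube;}\\
&\text{a cubic with no rational root is irreducible over } \mathbb{Q},
 \text{ so } [\mathbb{Q}(a):\mathbb{Q}] = 3,\\
&\text{contradicting } [\mathbb{Q}(a):\mathbb{Q}] \le 2.
\end{align*}
```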