Friday, August 28, 2015

elementary number theory - How can I prove that gcd(a, b) = 1 implies gcd(a², b²) = 1 without using prime decomposition?


How can I prove that if gcd(a, b) = 1, then gcd(a², b²) = 1, without using prime decomposition? I should only use the definition of gcd, the division algorithm, the Euclidean algorithm, and corollaries of those. Thanks for your help!


Answer



The golden rule for all problems about greatest common divisors, I claim (especially if you're specifically trying to avoid prime decomposition), is the fact that gcd(a, b) equals the smallest positive integer g that can be written in the form g = ax + by with integers x and y (this is Bézout's identity).
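For concreteness, here is the identity in action for one pair of my own choosing (a = 4, b = 9, which are not part of the original question):

\[
\gcd(4, 9) = 1, \qquad 4 \cdot 7 + 9 \cdot (-3) = 28 - 27 = 1,
\]

and indeed no smaller positive integer is expressible as 4x + 9y.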


In particular, gcd(a, b) = 1 if and only if there exist integers x and y such that ax + by = 1. (This fails with 1 replaced by a larger integer! It works for 1 precisely because 1 is the smallest positive integer there is: if 1 is attainable as ax + by, it is automatically the smallest attainable positive value, so it must be the gcd.)
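To see why the equivalence breaks for larger values, note that ax + by = d is solvable whenever gcd(a, b) divides d, so mere solvability does not pin down the gcd. A counterexample of my own choosing:

\[
2 \cdot 1 + 3 \cdot 1 = 5, \qquad \text{yet } \gcd(2, 3) = 1 \neq 5.
\]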


That's my general hint; my specific hint is to cube both sides of the equation ax+by=1.
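If you want to check your work after following the hint, here is a sketch of where cubing leads (the grouping below is one natural choice, not necessarily the one the answerer had in mind). Expanding and collecting the terms divisible by a² and by b²:

\[
1 = (ax + by)^3 = a^2\left(ax^3 + 3bx^2y\right) + b^2\left(3axy^2 + by^3\right).
\]

This writes 1 as an integer combination of a² and b², so by the equivalence above, gcd(a², b²) = 1.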

