Wednesday, July 6, 2016

elementary number theory - If $a,b \in\mathbb N$ and $\gcd(a,b)=1$, prove that $\gcd(a+b,a^2+b^2)= 1$ or $2$.


If $a,b \in\mathbb N$ and $\gcd(a,b)=1$, prove that $\gcd(a+b,a^2+b^2)$ is always equal to either $1$ or $2$, where $\gcd$ is the greatest common divisor. I haven't solved a problem like this before, so I'd appreciate some help. Thanks.


Answer



If an integer $d$ divides both $a+b$ and $a^2+b^2$, then


$d$ must divide $a^2+b^2+ (a-b)(a+b)=2a^2$


Similarly, $d$ must divide $a^2+b^2- (a-b)(a+b)=2b^2$


$\implies d$ must divide $\gcd(2a^2,2b^2)=2\gcd(a^2,b^2)$


As $\gcd(a,b)=1$, also $\gcd(a^2,b^2)=1$, so $d$ divides $2$, i.e. $d\in\{1,2\}$.


The GCD will be $1$ or $2$ according as $a,b$ are of opposite or the same parity: if one of $a,b$ is even, then $a+b$ is odd and the gcd is $1$; if they have the same parity, coprimality forces both to be odd, so $a+b$ and $a^2+b^2$ are both even and the gcd is $2$.
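
Not part of the original answer, but here is a minimal brute-force sketch that checks both the result and the parity criterion over a small range; the bound `LIMIT` is an arbitrary choice made just for illustration.

```python
from math import gcd

LIMIT = 200  # arbitrary search bound for the check

for a in range(1, LIMIT):
    for b in range(a + 1, LIMIT):
        if gcd(a, b) != 1:
            continue  # only coprime pairs are covered by the claim
        g = gcd(a + b, a * a + b * b)
        # gcd(a,b)=1 rules out both even, so equal parity means both odd,
        # in which case the gcd should be 2; otherwise it should be 1
        expected = 2 if a % 2 == b % 2 else 1
        assert g == expected, (a, b, g)

print("verified for all coprime pairs below", LIMIT)
```

Running it prints the confirmation line and raises no assertion, consistent with the proof above.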

