This is a problem for a graduate-level discrete math class that I'm hoping to take next year (as a senior undergrad). The problem is as stated in the title:
Given that $\gcd(k,l)=d$, prove that $\gcd(2^k-1,\,2^l-1)=2^d-1$.
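As a quick sanity check (my own numbers, not part of the problem), take $k=4$, $l=6$, so $d=2$:

$$\gcd(2^4-1,\ 2^6-1) = \gcd(15,\ 63) = 3 = 2^2-1.$$

So the statement at least checks out on small cases.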
The problem also says "hint: use Euclid's lemma," although I'm not sure which part of the problem it should be applied to. I've been thinking about it for a few days and I'm completely stumped.
I'm not even sure how to show that $2^d-1$ divides either $2^k-1$ or $2^l-1$. From the given, we know that $\exists c : dc = k$. We need to show that $\exists c' : (2^d-1)c' = 2^k-1$. Obviously $c' = 2^{k-d}$ gives you $(2^d-1)c' = 2^k - 2^{k-d}$, but I can't figure out how to get rid of the extra terms that the $-1$ brings in in various places.
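To make my confusion concrete (again my own numbers, not from the problem), take $d=2$, $k=6$, so $c=3$. Then the $c'$ I need would satisfy

$$(2^2-1)\,c' = 2^6-1 \quad\Longleftrightarrow\quad 3c' = 63 \quad\Longleftrightarrow\quad c' = 21,$$

and $21$ is odd, so no single power of $2$ can ever be the right $c'$; whatever works must be built some other way, and that's where I'm stuck.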
Applying the Euclidean-algorithm step $\gcd(k,l)=\gcd(k-l,\,l)$ on the left side, you know $\exists i : di = k-l$, and applying the same step on the right side, it suffices to show that $\gcd(2^k-2^l,\ 2^l-1) = 2^d-1$. And by Bézout's identity, it's enough to show that $2^d-1$ is a common divisor of those two numbers and can be written as a linear combination of them.
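Written out (taking $k \ge l$), the step I mean on the right side is

$$\gcd(2^k-1,\ 2^l-1) = \gcd\bigl((2^k-1)-(2^l-1),\ 2^l-1\bigr) = \gcd\bigl(2^l(2^{k-l}-1),\ 2^l-1\bigr),$$

and since $\gcd(2^l,\ 2^l-1)=1$, I suspect this is where the hinted Euclid's lemma is supposed to come in, though I haven't managed to push it through.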
Can anyone give me a hint?