Number theory: show that gcd(a+b, a^2-ab+b^2) = 1 or 3?

Given gcd(a,b)=1,

Show that gcd(a+b, a^2-ab+b^2)=1 or 3.

I've gotten it to the point of changing

a^2-ab+b^2 to (a+b)^2-3ab

and from there I should just have to work with

gcd(a+b, 3ab)

but I still can't show that the gcd is 1 or 3. Can someone point me in the right direction?
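
A quick brute-force check in Python (just a sketch to build confidence, not a proof; the bound 200 is an arbitrary choice) suggests that only 1 and 3 ever occur:

    from math import gcd

    seen = set()
    for a in range(1, 200):
        for b in range(a + 1, 200):
            if gcd(a, b) == 1:  # only consider coprime pairs
                seen.add(gcd(a + b, a*a - a*b + b*b))
    print(sorted(seen))  # prints [1, 3]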

2 Answers

  • 1 decade ago
    Favorite Answer

    If gcd(a+b, a^2-ab+b^2) = d

    then d divides both of them

    Since a^2-ab+b^2 = (a+b)^2 - 3ab, and d divides (a+b)^2, it follows that

    d divides 3ab.

    Now suppose d > 1 and let p be a prime divisor of d.

    So p divides 3ab

    Since p is prime, it must divide 3, a, or b.

    If p divides a or b, then since p also divides a+b, p divides both of them, contradicting gcd(a,b) = 1.

    So either d = 1, or every prime divisor of d is 3, in which case d = 3^k for some k >= 1.

    It remains to rule out k > 1: if 9 divided d, then 9 would divide 3ab, so 3 would divide ab and hence a or b; since 3 also divides a+b, it would then divide both a and b, again contradicting gcd(a,b) = 1. Hence d = 1 or d = 3.
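
    For a concrete check of that last step: with a = 4 and b = 5, a + b = 9 is divisible by 9, yet a^2 - ab + b^2 = 21 and gcd(9, 21) = 3, not 9.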

  • 1 decade ago

    a and b are relatively prime, that is, they have no common factor except 1, so gcd(a, b) = 1. Thus if a prime divides a, it does not divide b, and vice versa. So if a prime divides a + b, it cannot divide a or b: otherwise it would divide, say, a and a + b, and hence also (a + b) - a = b, contradicting gcd(a, b) = 1. On the other hand, if a prime divides ab, it must divide a or b. So no prime divides both a + b and ab, i.e. gcd(a + b, ab) = 1, and therefore gcd(a + b, 3ab) = gcd(a + b, 3). Now, if 3 divides a + b, then gcd(a + b, 3ab) = 3, whereas if 3 does not divide a + b, then gcd(a + b, 3ab) = 1.
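
    For instance, with a = 1, b = 2 we get a + b = 3 and a^2 - ab + b^2 = 3, so the gcd is 3, while with a = 2, b = 3 we get a + b = 5 and a^2 - ab + b^2 = 7, so the gcd is 1.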
