Greatest common divisor (Highest common factor)

You may know this name already; if not, you can easily guess what it is about. Suppose you have two natural numbers. Each has its own divisors, and chances are they have some divisors in common. The largest among the common divisors is, fittingly, called the greatest common divisor (GCD).

Definition: The greatest common divisor of the natural numbers b and c is the natural number g such that g|b and g|c (a common divisor), and if d|b and d|c then d <= g (the greatest among all common divisors). We can use the notation gcd(b,c).

For example, the divisors of 20 are 1, 2, 4, 5, 10, and 20; those of 12 are 1, 2, 3, 4, 6, and 12. So the common divisors of 20 and 12 are 1, 2, and 4, among which 4 is the largest, and therefore the GCD of 20 and 12 is 4.

As another example, the divisors of 10 are 1, 2, 5, and 10; those of 27 are 1, 3, 9, and 27. So 1 is the only common divisor, and it is therefore the GCD of 10 and 27.
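
If you want to experiment with more pairs, a small brute-force sketch along these lines (the helper names divisors and gcd_by_definition are just for illustration) computes the common divisors and picks the largest, exactly as in the definition above:

    def divisors(n):
        """All divisors of the natural number n, by trial division."""
        return {d for d in range(1, n + 1) if n % d == 0}

    def gcd_by_definition(b, c):
        """The GCD computed straight from the definition:
        the largest element of the set of common divisors."""
        return max(divisors(b) & divisors(c))

    print(sorted(divisors(20) & divisors(12)))  # [1, 2, 4]
    print(gcd_by_definition(20, 12))            # 4
    print(gcd_by_definition(10, 27))            # 1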

This latter example illustrates a fundamental point: since 1 is a universal divisor, it is always a common divisor of any two numbers, and thus common divisors always exist. The more interesting question is who else is there, and who is the greatest. The GCD of 10 and 27 is 1 because it is the only choice, while the GCD of 20 and 12 is 4 because 1 is not the only choice.

The former example (of 20 and 12) illustrates another point. Observe that the GCD, 4, is a multiple of all the other common divisors. This is in fact universally true. All common divisors must divide the GCD, and the GCD is always a multiple of the common divisors.
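
Before proving it, we can at least test this claim on a few pairs. The following sketch (using Python's math.gcd for the GCD and brute force for the common divisors) checks that every common divisor divides the GCD:

    import math

    def common_divisors(b, c):
        """All common divisors of b and c, by brute force."""
        return [d for d in range(1, min(b, c) + 1) if b % d == 0 and c % d == 0]

    for b, c in [(20, 12), (10, 27), (48, 36)]:
        g = math.gcd(b, c)
        # Every common divisor should divide the GCD.
        assert all(g % d == 0 for d in common_divisors(b, c))
        print(b, c, "gcd =", g, "common divisors:", common_divisors(b, c))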

To prove that this is universally true, we first need the following lemma. It talks about a collection of numbers of the form bi+cj, i.e., you vary i and j over the integers and collect the results of bi+cj. We call them ``linear combinations'' of b and c. Observe that bi+cj is always a multiple of any common divisor of b and c: if d|b and d|c, then d | bi+cj. Moreover, today we are only interested in the positive numbers among bi+cj.

Lemma: The minimum m of the set { bi+cj : i and j integers, and bi+cj > 0 } is gcd(b,c).

Proof. Since m belongs to the set, we can write m = bx + cy for some integers x and y. We first show that m is a common divisor of b and c. Suppose m does not divide b; then by the division theorem, b = qm + r with 0 < r < m. But then r = b - qm = b - q(bx + cy) = b(1 - qx) + c(-qy), so r is itself a positive linear combination of b and c that is smaller than m, contradicting the minimality of m. So m divides b. Similarly, m divides c. Being a common divisor, m <= gcd(b,c). On the other hand, gcd(b,c) divides both b and c, so it divides bx + cy = m, and therefore gcd(b,c) <= m. Hence m = gcd(b,c).
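
The lemma can also be checked numerically, though a finite search is only an illustration, not a proof. The sketch below collects the positive values of bi+cj with i and j restricted to an arbitrary small window, takes the minimum, and compares it with math.gcd:

    import math

    def min_positive_combination(b, c, bound=50):
        """Smallest positive b*i + c*j with i and j in a finite window;
        a finite stand-in for the infinite set in the lemma."""
        return min(b * i + c * j
                   for i in range(-bound, bound + 1)
                   for j in range(-bound, bound + 1)
                   if b * i + c * j > 0)

    for b, c in [(20, 12), (10, 27), (48, 36)]:
        print(b, c, min_positive_combination(b, c), math.gcd(b, c))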

Theorem: The natural number g is gcd(b,c) if and only if g is a common divisor of b and c, and g is a multiple of all common divisors of b and c: if d|b and d|c then d|g.

Proof. If g is a common divisor of b and c, and g is a multiple of all common divisors of b and c, then every common divisor d satisfies d|g and hence d <= g, so g is the greatest common divisor. Conversely, if g = gcd(b,c), then by the above lemma g is the minimum of that set, so g = bx + cy for some integers x and y; thus g is not only a common divisor but also a multiple of all common divisors, since any d with d|b and d|c also divides bx + cy.

It turns out that this characterization of GCDs is more useful than the original definition. Therefore we can define GCD alternatively:

Definition: The greatest common divisor of the natural numbers b and c is the natural number g such that g|b and g|c (a common divisor), and if d|b and d|c then d|g (the greatest).
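
As a final illustration, here is a sketch of a predicate (the name is_gcd_alternative is made up for this example) that checks a candidate g against this alternative definition by brute force:

    def is_gcd_alternative(g, b, c):
        """Check the alternative definition: g is a common divisor of b and c,
        and every common divisor of b and c divides g."""
        if b % g != 0 or c % g != 0:
            return False
        return all(g % d == 0
                   for d in range(1, min(b, c) + 1)
                   if b % d == 0 and c % d == 0)

    print(is_gcd_alternative(4, 20, 12))  # True
    print(is_gcd_alternative(2, 20, 12))  # False: 4 is a common divisor but does not divide 2
    print(is_gcd_alternative(1, 10, 27))  # True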

This definition extends well to exotic number systems such as the Gaussian integers, although in those cases the GCDs may not be unique, so we may need to change ``the'' to ``a'' when we generalize the above definition.