Reducibility
  1. 10 — Disjoint or bust
  2. 11 — Branch-and-bound in Python
  3. 12 — Crew pairings
Chapter IV

Set Packing

Pick as many sets as you can without two of them sharing an element. The dual of set cover, and the problem behind every "you can't double-book this resource" rule in the world.

10

Disjoint or bust

If two sets share even one element, you can only pick one.

Set packing takes a universe U and a family F of subsets of U, and asks for the largest subfamily of pairwise-disjoint sets:

max |F′|  subject to  F′ ⊆ F,  S ∩ T = ∅ for all S ≠ T ∈ F′.  (1)

Karp reduced clique to (1), and in fact the two are interreducible. Build the conflict graph with one vertex per set and an edge whenever two sets share an element: disjointness is non-adjacency there, hence adjacency in the complement graph, and a packing is exactly a clique in that complement.
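A minimal sketch of that correspondence, building the complement graph directly (edge = disjoint) on a made-up toy instance:

```python
from itertools import combinations

def packing_to_clique_graph(family):
    """Build the graph whose vertices are the sets in the family, with an
    edge exactly when two sets are disjoint (the complement of the conflict
    graph). Pairwise-disjoint subfamilies are exactly its cliques."""
    edges = set()
    for i, j in combinations(range(len(family)), 2):
        if family[i].isdisjoint(family[j]):
            edges.add((i, j))
    return edges

# Toy instance, made up for illustration:
F = [{1, 2}, {3, 4}, {2, 3}, {5}]
edges = packing_to_clique_graph(F)
# The packing {1,2}, {3,4}, {5} appears as the clique on vertices 0, 1, 3.
```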

The optimization version is hard to approximate within a factor of

n^(1−ε),  (2)

making this one of the genuinely brutal members of the 21: the gap (2) means even getting close to optimal is essentially as hard as solving exactly.

Figures, in plain terms
(1)

max is the optimization keyword: pick whatever maximizes the expression after it. |F'| with the vertical bars on either side is the cardinality — the count of elements in F'.

subject to is the standard separator between an objective and its constraints. Everything after it must hold; the max is taken over choices that satisfy them.

F' ⊆ F says F' is some subfamily of the original family F (the subset symbol again).

S ∩ T = ∅ uses the intersection symbol ∩ (the cap; "elements in both") and ∅ (the empty set, with a slash through the O). So S ∩ T = ∅ means S and T share no elements at all.

The clause for all S ≠ T ∈ F' quantifies over every pair of distinct sets in our chosen subfamily.
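Each piece of notation above has a direct Python counterpart; the sets here are made up for illustration:

```python
from itertools import combinations

# F' ⊆ F: a subfamily is a selection of the family's sets
F = [{1, 2}, {2, 3}, {4}]
F_prime = [F[0], F[2]]

# |F'|: cardinality is the length of that selection
size = len(F_prime)

# S ∩ T = ∅: empty intersection, i.e. the sets are disjoint
no_overlap = F[0] & F[2] == set()
# Python also spells this directly:
no_overlap_alt = F[0].isdisjoint(F[2])

# "for all S ≠ T ∈ F'": quantify over every pair of distinct chosen sets
is_packing = all(s.isdisjoint(t) for s, t in combinations(F_prime, 2))
```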

Reassembled: pick the largest subfamily where no two chosen sets share any element.
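That reassembled definition runs directly as a brute-force search, usable only on tiny made-up instances like this one:

```python
from itertools import combinations

def max_set_packing(family):
    """Try subfamilies from largest to smallest and return the first one
    whose sets are pairwise disjoint. Exponential in len(family)."""
    for size in range(len(family), 0, -1):
        for sub in combinations(family, size):
            if all(s.isdisjoint(t) for s, t in combinations(sub, 2)):
                return list(sub)
    return []

F = [{1, 2}, {3, 4}, {2, 3}, {5, 6}, {6, 7}]
best = max_set_packing(F)
# An optimal packing here has size 3, e.g. {1,2}, {3,4}, {5,6}.
```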

(2)

n^(1−ε) is the inapproximability factor. The exponent 1 − ε is "just barely below 1," where ε is a tiny positive number.

For n = 1000 and ε = 0.01, n^(1−ε) ≈ 933, almost as big as n itself. So on a 1000-set instance, no polynomial-time algorithm can guarantee a solution within a factor of even several hundred of optimal (unless P = NP).

This is dramatically worse than set cover's ln(|U|) gap: the problem is brutally hard in both the exact and approximation senses.
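To see the scale of that comparison, here are both gaps evaluated at the same n (ε = 0.01 is an arbitrary illustrative choice):

```python
import math

n, eps = 1000, 0.01
packing_gap = n ** (1 - eps)   # set packing's inapproximability factor
cover_gap = math.log(n)        # set cover's ln|U| greedy gap

# packing_gap is roughly 933; cover_gap is roughly 6.9
```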

In plain terms

You have a pile of bags, each holding a few items. You want to grab as many bags as you can such that no two bags ever held the same item. Sharing means rivalry — you have to drop one.

