In my increasing old age, I’ve glumly realized that my memory capacity lies somewhere 3 standard deviations down in the murky bottom of the gene pool. This spells disaster for my life ahead as a mathematician, but I try to compensate by remembering as few things as possible, cutting out expendable chunks such as needing to eat, birthdays (others’ and my own), and multiple instances of the same mathematical theorem. This is why I was really happy and angry when I realized that three theorems I have come to love are really the same thing. Happy because it means there are two fewer things to remember, angry because I really should have known this sooner (and I’m sure many of you have already taken this for granted). C’est la vie.
I. Hall’s Marriage Theorem: given a bipartite graph with parts A and B, there exists a matching of A into B if and only if for any subset S of A, the set N(S) of vertices in B with at least one edge to some vertex of S has cardinality at least that of S.
II. A ubiquitous nameless lemma about 0-1 matrices that I have seen in at least 3 places: given such a matrix, the minimal number of rows and/or columns needed to cover all the 1s is the same as the maximum number of 1s we can find such that no two are in the same row or column.
III. König’s Theorem: in a bipartite graph, the number of edges in a maximum matching equals the number of vertices in a minimum vertex cover.
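Since all three statements are finite, (I) is easy to sanity-check by brute force on a toy example. A minimal sketch (the graph, vertex labels, and helper names below are my own illustration, nothing canonical):

```python
from itertools import combinations, permutations

# Toy bipartite graph: A-vertex u has neighbor list adj[u] in B (labels 0..B-1).
adj = [[0, 1], [0], [1, 2]]
B = 3  # number of vertices on the right side

def hall_condition(adj):
    """Check |N(S)| >= |S| for every nonempty subset S of A."""
    n = len(adj)
    for k in range(1, n + 1):
        for S in combinations(range(n), k):
            neighborhood = set().union(*(adj[u] for u in S))
            if len(neighborhood) < k:
                return False
    return True

def has_full_matching(adj):
    """Brute force: can each A-vertex pick a distinct neighbor in B?"""
    n = len(adj)
    return any(all(p[u] in adj[u] for u in range(n))
               for p in permutations(range(B), n))

print(hall_condition(adj), has_full_matching(adj))  # -> True True
```

Both sides flip to False together if, say, two A-vertices share one common neighbor and have no other edges.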
Since they don’t always come up in the same context, I’ve always treated them as different chunks of information to package into my brain. Now that they’ve been juxtaposed I guess it should have been more obvious, but it took this old geezer a while to realize that yes, they are all the same (thanks to the principle of max-flow and min-cut).
For (I), this is classical. The main upshot is that a maximum matching is the same as a maximum flow, once you add a source with an edge to every vertex of A and a sink with an edge from every vertex of B, all capacities 1. This is a favorite space-filler of college courses, so I’ll omit the rest.
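With unit capacities, the augmenting-path step of max-flow specializes to the classical alternating-path search for matchings, so the matching-via-flow idea fits in a few lines. A sketch (purely illustrative; the function and variable names are mine):

```python
# Maximum bipartite matching by augmenting paths: exactly max-flow on the
# network with a source into every A-vertex and a sink out of every B-vertex,
# all capacities 1 (the source and sink never need to be built explicitly).

def max_bipartite_matching(adj, n_right):
    """adj[u] = neighbors in B of left vertex u; returns the matching size."""
    match_right = [-1] * n_right  # match_right[v] = A-vertex matched to v, or -1

    def try_augment(u, visited):
        # Search for an augmenting path starting from the free A-vertex u.
        for v in adj[u]:
            if v in visited:
                continue
            visited.add(v)
            # v is free, or its current partner can be re-routed elsewhere
            if match_right[v] == -1 or try_augment(match_right[v], visited):
                match_right[v] = u
                return True
        return False

    return sum(try_augment(u, set()) for u in range(len(adj)))

# A = {0,1,2}, B = {0,1}: only 2 of the 3 left vertices can be matched.
print(max_bipartite_matching([[0], [0, 1], [1]], 2))  # -> 2
```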
For (III), I randomly realized you can do the same here: add a source and a sink as before. Then a matching is a flow, and a vertex cover is just a cut (since every edge must be severed, and cutting the source edge into a vertex of A, or the sink edge out of a vertex of B, amounts to putting that vertex in the cover).
For (II), the subtlety is only a slight deterrent. This was the main cute idea that I realized while cooking breakfast, inexplicably and (unfortunately) unreproducibly inspired by oatmeal and camomile tea: make a bipartite graph with parts A and B out of the matrix, where the rows correspond to the vertices in A and the columns correspond to the vertices in B, with an edge exactly when the corresponding matrix entry is a 1 and no edge otherwise. Then a matching is just a set of 1s with no two in the same row or column, and a vertex cover is just a set of rows and columns (choosing a row covers the corresponding vertex in A, and similarly for columns and B). There.
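The dictionary in (II) can also be checked by brute force on a small 0-1 matrix: the fewest covering lines and the most pairwise row/column-disjoint 1s coincide. The matrix below is just a made-up example:

```python
from itertools import combinations

M = [[1, 1, 0],
     [1, 0, 0],
     [0, 1, 1]]
n, m = len(M), len(M[0])
ones = [(i, j) for i in range(n) for j in range(m) if M[i][j] == 1]
lines = [('r', i) for i in range(n)] + [('c', j) for j in range(m)]

def covers(chosen):
    # Does this set of row/column lines hit every 1?
    return all(('r', i) in chosen or ('c', j) in chosen for (i, j) in ones)

def independent(cells):
    # No two chosen 1s share a row or a column.
    rows = [i for i, _ in cells]
    cols = [j for _, j in cells]
    return len(set(rows)) == len(rows) and len(set(cols)) == len(cols)

min_cover = min(k for k in range(len(lines) + 1)
                for S in combinations(lines, k) if covers(set(S)))
max_indep = max(k for k in range(len(ones) + 1)
                for S in combinations(ones, k) if independent(S))

print(min_cover, max_indep)  # -> 3 3
```

Translating rows and columns to vertices and 1s to edges, min_cover is exactly a minimum vertex cover and max_indep a maximum matching, so this is (III) in disguise.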
P.S. Of course I was speaking in jest about needing to forget – even if statements are equally strong, there are definitely situations that call for one instead of another. Look at the Axiom of Choice – or my less-clichéd favorite example of equivalent things that are obviously non-equivalent: the “weak” Nullstellensatz and the “real” Nullstellensatz. I’m just old and grumbling.
P.P.S. Yes, I know there is also Dilworth’s theorem. However, I feel the most “immediate” link among these three is max-flow / min-cut.
P.P.P.S. I’m not actually that old.