Posted by: yanzhang | May 27, 2010

## On finding the ring of invariants

For graduate students, one of the easiest ways to procrastinate on “doing math” is learning math (although Math Overflow is also way up there). My recent tangent is low-brow invariant theory via Mara Neusel’s text. The book has a couple of recurring techniques that I liked because they use basic algebra in very dexterous ways. Here, I’ll outline one of these techniques and make some comments.

The main goal is finding the ring of invariants $F[V]^G$ for a representation of a group $G$ on a vector space $V$. We pick a basis $x_1, \ldots, x_n$ for $V$, so we can think of $F[V]$ as the ring of polynomials in the $x_i$. To do this, guess (intelligently!) some $G$-invariant polynomials $f_1, \ldots, f_n$, and consider the algebra $A = F[f_1, \ldots, f_n]$. Now, check the following things:

1. the extension $A \rightarrow F[V]$ is integral;
2. $A$ is integrally closed;
3. $F(A) = F(V)^G$, where $F(A)$ and $F(V)$ are the fields of fractions of $A$ and $F[V]$ respectively.

If so, then $A$ is the desired ring of invariants $F[V]^G$.
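To make the checklist concrete, here is a minimal sanity check in sympy (my own toy setup, not from the book) for the smallest interesting case: $G = S_2$ acting on $F[x, y]$ by swapping the variables, with the guessed invariants $f_1 = x + y$ and $f_2 = xy$. This only verifies the zeroth step of the recipe, that the guesses really are $G$-invariant:

```python
import sympy as sp

x, y = sp.symbols('x y')
guesses = [x + y, x * y]  # guessed G-invariants for G = S_2 swapping x and y

def swap(p):
    """Apply the nontrivial element of S_2 (swap x and y) to a polynomial."""
    return p.subs({x: y, y: x}, simultaneous=True)

for f in guesses:
    # each guess is fixed by every group element, hence G-invariant
    assert sp.expand(swap(f) - f) == 0
print("all guesses are G-invariant")
```

Conditions 1–3 then say that $A = F[x+y, xy]$ is all of $F[x,y]^{S_2}$, which is of course the classical fundamental theorem of symmetric polynomials in two variables.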

The reason this works is the following chain of thought: first, $F[V]^G$ satisfies the first two conditions. “1” involves staring at the characteristic polynomial $\prod_{g \in G} (x - gf)$ for any polynomial $f \in F[V]$ and looking at its coefficients (which are all $G$-invariant by symmetry), and “2” is just a routine check via the definitions. These conditions basically translate into the equivalences $A = F(A) \cap F[V]$ and $F[V]^G = F(F[V]^G) \cap F[V]$. It can then be proved (I was dumb and almost took this for granted, but luckily math wins so it was true) that $F(F[V]^G) = F(V)^G$. “3” then immediately shows that $A = F[V]^G$.
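The symmetry argument behind “1” can be watched in action. Below is a sketch (again my toy case, $G = S_2$ swapping $x$ and $y$) that builds $\prod_{g \in G}(X - gf)$ for $f = x^2$ and checks that every coefficient is $G$-invariant, since each coefficient is a symmetric function of the orbit of $f$:

```python
import sympy as sp

x, y, X = sp.symbols('x y X')
f = x**2
# orbit of f under G = S_2: {x^2, y^2}
orbit = [f, f.subs({x: y, y: x}, simultaneous=True)]

# the "characteristic polynomial" prod_{g in G} (X - g.f)
char_poly = sp.expand(sp.prod(X - h for h in orbit))
coeffs = sp.Poly(char_poly, X).all_coeffs()  # [1, -(x^2 + y^2), x^2*y^2]

for c in coeffs:
    swapped = sp.expand(c.subs({x: y, y: x}, simultaneous=True))
    assert swapped == sp.expand(c)  # each coefficient is G-invariant
```

Since $f$ is a root of this monic polynomial with $G$-invariant coefficients, every $f \in F[V]$ is integral over $F[V]^G$, which is exactly condition 1 for the “right answer.”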

So how well does this recipe work? Qualitatively, the only metric I have is whether the parts are individually easy to check. “1” is not that hard – we just need to exhibit monic polynomials over $A$ satisfied by the basis elements $x_i$, and the characteristic polynomials above suffice. “2” is harder. The main examples given in the book have $A$ explicitly generated by algebraically independent elements, so $A$ is clearly isomorphic to a polynomial ring in that case and thus integrally closed, but I don’t know if there is a deterministic way in general to check this property (Q: is there?). “3” is surprisingly not bad: note that since $F(A)$ is always a subfield of $F(V)^G$ by definition, it suffices to show that $|F(V) : F(A)| = |F(V) : F(V)^G| = |G|$, and there are many ways to do this computation in a down-to-earth manner.
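For the algebraic-independence part of “2”, one standard tool (in characteristic 0) is the Jacobian criterion: $n$ polynomials in $n$ variables are algebraically independent if and only if the determinant of their Jacobian matrix is nonzero. Here is a quick sympy sketch for the elementary symmetric polynomials in three variables, where the determinant famously comes out as the Vandermonde determinant (up to sign):

```python
import sympy as sp
from itertools import combinations

xs = sp.symbols('x1 x2 x3')

def elem_sym(k, vs):
    """k-th elementary symmetric polynomial in the variables vs."""
    return sp.Add(*[sp.Mul(*c) for c in combinations(vs, k)])

es = [elem_sym(k, xs) for k in (1, 2, 3)]
# Jacobian matrix: rows indexed by e_i, columns by x_j
J = sp.Matrix([[sp.diff(e, v) for v in xs] for e in es])
detJ = sp.factor(J.det())
# nonzero determinant => e1, e2, e3 are algebraically independent (char 0)
assert detJ != 0
print(detJ)
```

The factored determinant is a product of the differences $x_i - x_j$, so it is visibly nonzero.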

As an exercise, I’ll try to do an example. Consider the defining representation of the symmetric group $S_n$, where permutations act on $x_1, \ldots, x_n$ as $n \times n$ permutation matrices. Intuitively, the ring of invariants should be $A = F[e_1, \ldots, e_n]$, where $e_i$ is the $i$-th elementary symmetric polynomial. It is easy to check that all of these are invariant in our representation, so it suffices to go down the checklist:

1. the orbit of each $x_i$ is the same: it’s exactly all of the $x_i$! Thus, we can consider their shared characteristic polynomial $P(x) = \prod_t (x - x_t) = \sum_k (-1)^{n-k} e_{n-k} x^k$. These coefficients are in $A$ (up to sign, they’re the actual generators), so the extension $A \rightarrow F[V]$ is integral.
2. the generators of $A$ are algebraically independent (this is a fundamental property of the theory of symmetric functions, although it might be fun to take the Jacobian as well. I’m being kind of lame since what we’re trying to prove here is also basically true by fundamental properties of the theory, but bear with me), so $A$ is a polynomial ring and thus integrally closed.
3. $|F(x_1, \ldots, x_n) : F(A)| = n!$. The easiest way I have thought of to show this is induction. Here’s a sketch: look at $F(A)(x_n) : F(A)$. This extension has degree $n$, with minimal polynomial $P(x)$ defined above. However, since $e_k = x_n e'_{k-1} + e'_k$, where $e'_k$ is the $k$-th elementary symmetric polynomial in the variables $x_1, x_2, \ldots, x_{n-1}$, we can think of $F(A)(x_n)$ as $F(e'_1, \ldots, e'_{n-1})(x_n)$. Because of the algebraic independence between $x_n$ and the $e'_k$, $|F(x_1, \ldots, x_n) : F(e'_1, \ldots, e'_{n-1})(x_n)|$ is really just $|F(x_1, \ldots, x_{n-1}) : F(e'_1, \ldots, e'_{n-1})| = (n-1)!$ by induction, so the degree of our original extension is $n!$.
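The workhorse of the induction step is the identity $e_k = x_n e'_{k-1} + e'_k$, which just says that a $k$-element subset of $\{x_1, \ldots, x_n\}$ either contains $x_n$ or doesn’t. It can also be checked symbolically; here is a sympy sketch for $n = 4$, with the conventions $e'_0 = 1$ and $e'_n = 0$:

```python
import sympy as sp
from itertools import combinations

def elem_sym(k, vs):
    """k-th elementary symmetric polynomial in vs (e_0 = 1, e_k = 0 for k > len(vs))."""
    if k == 0:
        return sp.Integer(1)
    return sp.Add(*[sp.Mul(*c) for c in combinations(vs, k)])

xs = sp.symbols('x1 x2 x3 x4')
head, xn = xs[:-1], xs[-1]  # x1..x_{n-1} and x_n

for k in range(1, 5):
    lhs = elem_sym(k, xs)                                  # e_k in n variables
    rhs = xn * elem_sym(k - 1, head) + elem_sym(k, head)   # x_n e'_{k-1} + e'_k
    assert sp.expand(lhs - rhs) == 0
print("identity holds for n = 4")
```

This is what lets us swap the generators $e_k$ for the $e'_k$ after adjoining $x_n$, which is the whole induction.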

I wondered how one would come up with such a design, where the main glue seems to just be elementary properties of integral extensions from the good old undergraduate algebra days. My naive guess would be a line of thinking that goes something like this: the first two properties on the list are fairly natural, and since we know the “right answer” (i.e. $F[V]^G$) needs to satisfy them, we might as well check them. The third item really involves the “trivial” observation that for an integrally closed ring $R$ with an integral extension into $F[V]$, we have $F(R) \cap F[V] = R$. I’m fairly annoyed because this is one of those things that sound completely obvious in retrospect, although I’d be hard pressed to motivate it myself (Q: how would one do this?). Considering there are many much more elaborate proof schemes in mathematics than this, there’s still a long way to go…

As usual, I have little experience here so any comments would be great.

-Yan