Posted by: yanzhang | May 13, 2008

## An Invertible Basis

Is there a basis of the space of linear transformations from $\mathbb{R}^n$ to itself made of only invertible elements?

Proof:

Well yes, though the basis is a strange one. Let's work in matrix-land first and think of these as matrices, since they are easier to conceptualize. Take the standard Kronecker basis $E_{ij}$ (the matrices with a 1 in exactly one position and 0 elsewhere) and add $I$, the identity matrix, to each. Note that every $E_{ij} + I$ really is invertible: for $i \neq j$ it is the identity plus a nilpotent matrix, so its determinant is 1, and for $i = j$ its determinant is 2.

To see that they span, take any matrix $A$ and consider the linear combination of the Kronecker basis you would use to get $A$. The same combination of the new basis elements sums to $A$ plus some multiple of $I$. Since adding the $n$ diagonal elements $E_{ii} + I$ gives $(n+1)I$, every multiple of $I$ lies in the span, so you can subtract off the extra copies of $I$ to recover $A$.

To see that they are linearly independent, you can either check directly or just count dimensions: these are $n^2$ vectors spanning a space of dimension $n^2$, so they must be a basis.
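The argument above can be checked numerically. Here is a small sketch (in Python with NumPy, my choice, not from the post) that builds the matrices $E_{ij} + I$, confirms each is invertible, and confirms they span by flattening them into rows of an $n^2 \times n^2$ matrix and checking its rank:

```python
import numpy as np

n = 3
I = np.eye(n)

# Build the proposed basis: E_ij + I for every position (i, j).
basis = []
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = 1.0
        basis.append(E + I)

# Every element is invertible: det(E_ij + I) is 1 for i != j
# (identity plus a nilpotent) and 2 for i == j.
assert all(abs(np.linalg.det(B)) > 0.5 for B in basis)

# The n^2 matrices span iff their flattened versions, stacked as
# rows of an n^2 x n^2 matrix, give a full-rank matrix.
M = np.array([B.flatten() for B in basis])
assert np.linalg.matrix_rank(M) == n * n
print("all invertible, and they form a basis")
```

The rank check also confirms linear independence for free, matching the dimension-count argument.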

It is kind of weird that this is true (is it true for $\mathbb{Z}^n$?), but it is nice to have invertible matrices. I'm going to try to find problems where I can use this fact, since it is a pretty strong property to have "for free."

-Y

Source: Linear Algebra Problem Book (Halmos). God I love Halmos.

1. The result holds for $\mathbf{Z}^n$ as well. By the above argument it suffices to show that for any fixed choice of diagonal there exists an integer matrix, invertible over $\mathbf{Z}$, with that diagonal. This is easy to show by breaking the matrix up into 2×2 blocks along the diagonal.
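    The 2×2 block trick can be made concrete. For a block with prescribed diagonal entries $a, b$, the matrix $\begin{pmatrix} a & 1 \\ ab-1 & b \end{pmatrix}$ has determinant $ab - (ab-1) = 1$, so a block-diagonal matrix built from such blocks is invertible over $\mathbf{Z}$. A small sketch (my construction of the comment's idea, covering even $n$; odd $n$ needs one extra block handled separately):

    ```python
    import numpy as np

    def invertible_with_diagonal(diag):
        """Integer matrix with the given diagonal (even length) and
        determinant 1, built from 2x2 blocks [[a, 1], [a*b - 1, b]]."""
        assert len(diag) % 2 == 0
        n = len(diag)
        M = np.zeros((n, n), dtype=int)
        for k in range(0, n, 2):
            a, b = diag[k], diag[k + 1]
            M[k, k], M[k, k + 1] = a, 1          # block row 1: [a, 1]
            M[k + 1, k], M[k + 1, k + 1] = a * b - 1, b  # block row 2
        return M

    M = invertible_with_diagonal([0, 5, -3, 7])
    # Each block has determinant 1, so det(M) = 1: invertible over Z.
    assert round(np.linalg.det(M)) == 1
    ```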
2. You can think about it this way: the space of matrices is just like $\mathbb R^{n^2}$, and the invertible matrices are an open set, so in particular they contain a ball, which means there are vectors in all the independent directions, therefore you have a basis. (As a matter of fact, I think that very argument works for integer matrices, because if a matrix $A$ is invertible, so is any $B$ such that $\|A - B\| < \frac{1}{\|A^{-1}\|}$.)
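    The perturbation bound at the end of this comment can be demonstrated numerically. The idea is that $B = A\bigl(I + A^{-1}(B - A)\bigr)$, and the second factor is invertible whenever $\|A^{-1}(B-A)\| < 1$ (Neumann series). A quick sketch (my own check, using the spectral norm):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = np.eye(4) + 0.1 * rng.standard_normal((4, 4))
    assert abs(np.linalg.det(A)) > 1e-9  # A is invertible

    # Radius of the guaranteed ball of invertible matrices around A.
    radius = 1.0 / np.linalg.norm(np.linalg.inv(A), 2)

    for _ in range(100):
        P = rng.standard_normal((4, 4))
        P *= 0.99 * radius / np.linalg.norm(P, 2)  # stay inside the ball
        B = A + P
        assert abs(np.linalg.det(B)) > 1e-12  # still invertible
    print("every perturbation inside the ball stayed invertible")
    ```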