Is there a basis of the space of linear transformations from R^n to itself made of only invertible elements?
Well, yes. It is kind of strange though. Let's work in matrix-land first, since matrices are easier to conceptualize. Just take your basic Kronecker basis (the matrices E_ij containing 1 in exactly one position and 0 elsewhere) and add I, the identity matrix, to each. Every E_ij + I is invertible: for i = j it is a diagonal matrix with a single 2 (determinant 2), and for i ≠ j it is the identity plus a nilpotent (determinant 1).
To see that they span, take any matrix A and consider the linear combination of the Kronecker basis you would use to get A. That same combination of the new basis elements sums to A plus some multiple of I. But I itself is in the span: adding the n shifted diagonal elements E_ii + I gives I + nI = (n+1)I, and scaling by 1/(n+1) gives I. So you can subtract off the extra copies of I to recover A.
To see that they are linearly independent, you can either do it directly or just count dimensions: n^2 vectors that span a space of dimension n^2 must be independent, so I am done.
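As a quick numerical sanity check (my own sketch with NumPy, not from the book), we can build the E_ij + I basis for a small n, confirm every element is invertible, confirm the flattened elements have full rank (so they span), and reconstruct an arbitrary A from them:

```python
import numpy as np

n = 3
I = np.eye(n)

# Basis candidates: B_ij = E_ij + I, where E_ij has a single 1 at (i, j).
basis = []
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = 1.0
        basis.append(E + I)

# Every element is invertible (nonzero determinant).
assert all(abs(np.linalg.det(B)) > 1e-9 for B in basis)

# They span: stacking the flattened matrices gives an n^2 x n^2
# matrix of full rank.
M = np.stack([B.ravel() for B in basis])
assert np.linalg.matrix_rank(M) == n * n

# Reconstruct an arbitrary A: solve M.T @ c = A.ravel() for the
# coefficients c with sum_k c_k * basis_k = A.
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
c = np.linalg.solve(M.T, A.ravel())
recon = sum(ck * B for ck, B in zip(c, basis))
assert np.allclose(recon, A)
print("all checks passed")
```

Note that the reconstruction solves one n^2 x n^2 linear system, which mirrors the spanning argument above: the coefficients are the Kronecker coefficients of A shifted by a common multiple of I.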
It is kind of weird that this is true (is this true for ?), but it is nice to have invertible matrices. I'm going to try to find problems where I can use this fact, since it is a pretty strong property to get "for free."
Source: Linear Algebra Problem Book (Halmos). God I love Halmos.