Before leaving the subject of mappings, recall that Linear(Space, Space) is a vector space itself, so it ought to have a basis, inner products, a bilinear functional, a dual, and so on. A Fourier pair has already been exhibited as such a basis, at least for any matrix with a complete set of eigenvectors. The basis which is most directly connected with Linear(Space, Space) is the collection of matrices $E_{ij}$, all of whose matrix elements are zero except for the one at the intersection of the $i$th row and $j$th column. The rule of multiplication is

$$E_{ij}\,E_{kl} = \delta_{jk}\,E_{il}.$$

The expansion of a matrix in this basis is just

$$M = \sum_{i,j} M_{ij}\,E_{ij}.$$
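A brief numerical sketch (using NumPy; the dimension and the variable names are chosen purely for illustration) confirms both the multiplication rule and the expansion:

    import numpy as np

    n = 3
    # Standard matrix units: E[i][j] has a 1 at row i, column j and zeros elsewhere.
    E = [[np.zeros((n, n)) for _ in range(n)] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            E[i][j][i, j] = 1.0

    # Multiplication rule: E_ij E_kl = delta_jk E_il.
    for i in range(n):
        for j in range(n):
            for k in range(n):
                for l in range(n):
                    expected = E[i][l] if j == k else np.zeros((n, n))
                    assert np.allclose(E[i][j] @ E[k][l], expected)

    # Expansion of an arbitrary matrix: M = sum_ij M_ij E_ij.
    M = np.random.rand(n, n)
    reconstruction = sum(M[i, j] * E[i][j] for i in range(n) for j in range(n))
    assert np.allclose(M, reconstruction)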
Why isn't the determinant of a matrix product a candidate for an inner product? Because it is not bilinear. Are other inner products feasible? Consider $\mathrm{tr}(A^\dagger G B)$ for a positive definite matrix $G$; but that is just like introducing a metric matrix into the ordinary inner product for vectors.
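Here is a minimal sketch of such an inner product, assuming the trace form $\langle A,B\rangle = \mathrm{tr}(A^\dagger G B)$ written above; the construction of $G$ and the test matrices are illustrative only:

    import numpy as np

    n = 3
    rng = np.random.default_rng(0)

    # A positive definite "metric" G, built as S^dagger S for a random invertible S.
    S = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    G = S.conj().T @ S

    def inner(A, B):
        # Candidate inner product on matrices: <A, B> = tr(A^dagger G B).
        return np.trace(A.conj().T @ G @ B)

    A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

    # Linearity in the second argument, and positivity and reality of <A, A>.
    assert np.isclose(inner(A, 2*B + 3*C), 2*inner(A, B) + 3*inner(A, C))
    assert inner(A, A).real > 0 and np.isclose(inner(A, A).imag, 0)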
How do we get a basis for matrices compatible with the eigenvector basis for vectors? Consider the column-by-row products (column $i$)(row $j$); their multiplication table is similar to the multiplication table for the standard matrices built from the unit vectors, and the trace relationships are also verifiable.
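A sketch of this claim, assuming a diagonalizable matrix whose eigenvector columns are collected in a matrix $U$ and whose reciprocal row eigenvectors are the rows of $U^{-1}$ (the names are ours, chosen for illustration):

    import numpy as np

    n = 3
    rng = np.random.default_rng(1)

    # A diagonalizable matrix A with eigenvector columns in U; the rows of U^{-1}
    # are the reciprocal (row) eigenvectors.
    A = rng.standard_normal((n, n))
    eigvals, U = np.linalg.eig(A)
    Uinv = np.linalg.inv(U)

    # Column-by-row products F_ij = (column i of U)(row j of U^{-1}).
    F = [[np.outer(U[:, i], Uinv[j, :]) for j in range(n)] for i in range(n)]

    # Same multiplication table as the E_ij: F_ij F_kl = delta_jk F_il,
    # and the same trace relationship: tr F_ij = delta_ij.
    for i in range(n):
        for j in range(n):
            assert np.isclose(np.trace(F[i][j]), 1.0 if i == j else 0.0)
            for k in range(n):
                for l in range(n):
                    expected = F[i][l] if j == k else np.zeros((n, n))
                    assert np.allclose(F[i][j] @ F[k][l], expected)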
How do we get a basis for matrices consisting of monomials when $A$ and $B$ don't commute, so that $AB$ isn't $BA$ and $ABA$ isn't $A^2B$, and so on? Evidently it is sufficient to know how to rewrite $BA$, even though the calculation could be tedious. The best thing is to let the eigenvectors speak for themselves, which they do through the diagonalizing matrices and the column-by-row idempotents.
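To make the last remark concrete, here is a small sketch with arbitrary test matrices $A$ and $B$ (the names are assumptions for illustration): the monomials really are distinct, while the diagonalizing matrix of $A$ alone already trivializes its powers.

    import numpy as np

    n = 3
    rng = np.random.default_rng(2)
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))

    # Non-commutativity: the monomials really are distinct.
    assert not np.allclose(A @ B, B @ A)
    assert not np.allclose(A @ A @ B, A @ B @ A)

    # Letting the eigenvectors of A "speak": A = U diag(lam) U^{-1}, so any power
    # of A reduces to powers of the eigenvalues sandwiched between U and U^{-1}.
    lam, U = np.linalg.eig(A)
    Uinv = np.linalg.inv(U)
    assert np.allclose(np.linalg.matrix_power(A, 5),
                       U @ np.diag(lam**5) @ Uinv)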
Suppose that $A$ has a family of idempotents $P_i$ which are column-by-row products of eigenvectors, and that $B$ has a similar family $Q_j$. That leaves us with four bases which can be collected into matrices: $U$, whose columns are the eigenvectors of $A$; $U^{-1}$, whose rows are also eigenvectors of $A$; similarly $V$ and $V^{-1}$ for $B$. All the column eigenvectors of $B$ are linear combinations of column eigenvectors of $A$ ($U^{-1}V$ stands to the right, since the matrix product wants to combine columns of $U$ to get $V = U\,(U^{-1}V)$). That means that the products $u_i\bar v_j$, formed from the columns $u_i$ of $U$ and the rows $\bar v_j$ of $V^{-1}$, form a basis for square matrices, since they are just multiples of a column eigenvector of $A$ by a row eigenvector of $B$, and columns by rows already created bases, both for $A$ and for $B$
, just as they do in the standard basis. Also, $V^{-1} = (V^{-1}U)\,U^{-1}$, making a particular row eigenvector of $B$ satisfy $\bar v_j = \sum_k (V^{-1}U)_{jk}\,\bar u_k$.
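A numerical sketch of the mixed basis, using the notation introduced above ($U$, $V$, and the products $u_i\bar v_j$) with arbitrary test matrices; the spanning claim is checked by stacking the vectorized outer products:

    import numpy as np

    n = 3
    rng = np.random.default_rng(3)
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))

    _, U = np.linalg.eig(A)   # columns: eigenvectors of A
    _, V = np.linalg.eig(B)   # columns: eigenvectors of B
    Uinv, Vinv = np.linalg.inv(U), np.linalg.inv(V)

    # Columns of V are combinations of columns of U, with the coefficient
    # matrix U^{-1}V standing on the right.
    assert np.allclose(V, U @ (Uinv @ V))

    # The mixed products u_i v-bar_j (column of U times row of V^{-1}) span all
    # square matrices: their vectorizations form an invertible n^2 x n^2 array.
    mixed = [np.outer(U[:, i], Vinv[j, :]) for i in range(n) for j in range(n)]
    stack = np.array([m.ravel() for m in mixed])
    assert np.linalg.matrix_rank(stack) == n * n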
The point of this exercise is that $A$ and $B$, as well as any other matrix, can be written in this mixed basis; in that form their powers and products can then be calculated in terms of the matrix $U^{-1}V$ and its relatives.
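To spell that out under the same illustrative assumptions: if $M = \sum_{i,j} C_{ij}\,u_i\bar v_j = U C V^{-1}$ and $N = U D V^{-1}$, then $MN = U\,C\,(V^{-1}U)\,D\,V^{-1}$, so the coefficients of a product are obtained by interleaving the connecting matrix $V^{-1}U$ (a relative of $U^{-1}V$) between the coefficient arrays. A short check:

    import numpy as np

    n = 3
    rng = np.random.default_rng(4)
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))
    _, U = np.linalg.eig(A)
    _, V = np.linalg.eig(B)
    Uinv, Vinv = np.linalg.inv(U), np.linalg.inv(V)

    def coeffs(M):
        # Coefficients of M in the mixed basis u_i v-bar_j, i.e. M = U C V^{-1}.
        return Uinv @ M @ V

    K = Vinv @ U   # connecting matrix relating the two eigenvector bases

    CA, CB = coeffs(A), coeffs(B)
    # The product AB has mixed-basis coefficients CA K CB: products reduce to
    # matrix multiplications interleaved with the connecting matrix.
    assert np.allclose(coeffs(A @ B), CA @ K @ CB)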
If we truly believe everything we have said, then we ought to find