
Cayley-Hamilton theorem

In fact, $c_n=1$; the last equation of the series asserts that $\chi(M) = O$, a proposition generally known as the Cayley-Hamilton Theorem: a matrix satisfies its own characteristic equation.
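The theorem is easy to check numerically. The following is a minimal sketch, with an arbitrary example matrix, numpy's poly supplying the characteristic coefficients (monic, so the leading coefficient matches $c_n = 1$), and Horner's rule evaluating the matrix polynomial:

\begin{verbatim}
import numpy as np

# An arbitrary example matrix.
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Coefficients of the characteristic polynomial,
# highest power first; the leading coefficient is 1.
coeffs = np.poly(M)

# Evaluate chi(M) by Horner's rule, with the scalar
# coefficients entering as multiples of the identity.
n = M.shape[0]
chi_M = np.zeros((n, n))
for c in coeffs:
    chi_M = chi_M @ M + c * np.eye(n)

print(np.allclose(chi_M, np.zeros((n, n))))   # True
\end{verbatim}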

If the factors of the characteristic polynomial are known, say by having evaluated the characteristic determinant and having found its roots, the matrix polynomial could be factored:

\begin{eqnarray*}
\chi(M) & = & (M - \lambda_1 I) (M - \lambda_2 I)
(M - \lambda_3 I) \ldots (M - \lambda_n I).
\end{eqnarray*}
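Given the roots, the factored form can be verified directly as well; a sketch under the same assumptions, with numpy's eigvals standing in for having evaluated the characteristic determinant and found its roots:

\begin{verbatim}
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])           # eigenvalues 1 and 3

lams = np.linalg.eigvals(M)
n = M.shape[0]

# chi(M) as the product of the factors (M - lambda_i I).
P = np.eye(n)
for lam in lams:
    P = P @ (M - lam * np.eye(n))

print(np.allclose(P, np.zeros((n, n))))   # True
\end{verbatim}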



Grouping all the factors but one into a single polynomial $g_i(M)$, and equating the full product to zero,

\begin{eqnarray*}
(M - \lambda_i I) g_i(M) & = & O \\
M g_i(M) & = & \lambda_i g_i(M),
\end{eqnarray*}



results in something which could be called an eigenmatrix of $M$: all of its columns are eigenvectors belonging to the eigenvalue $\lambda_i$, and of course its rows are likewise left eigenvectors.
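In the same spirit, $g_i(M)$ can be built as the product of the remaining factors and its columns and rows checked against the eigenvalue equation; a sketch, again with an arbitrary example matrix and numerically computed roots:

\begin{verbatim}
import numpy as np

M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])      # distinct eigenvalues

lams = np.linalg.eigvals(M)
n = M.shape[0]
i = 0                                # single out lambda_i

# g_i(M): the product of all the factors
# except (M - lambda_i I) itself.
g_i = np.eye(n)
for j, lam in enumerate(lams):
    if j != i:
        g_i = g_i @ (M - lam * np.eye(n))

# Columns are eigenvectors, rows are left eigenvectors:
print(np.allclose(M @ g_i, lams[i] * g_i))   # True
print(np.allclose(g_i @ M, lams[i] * g_i))   # True
\end{verbatim}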

If all the eigenvalues of $M$ were distinct, there would be only one eigenvector for each eigenvalue, aside from a scalar multiplier. Therefore, using Dirac's bra and ket notation, and observing that all the rows of $g_i$ would be proportional to one another, just as all the columns would have to be, one could deduce that $g_i$ was proportional to

\begin{displaymath}\frac{\vert i><i \vert}{<i \vert i>}. \end{displaymath}

If a multiplication table were made up for such column-by-row products, the convenient denominator and the orthogonality of left and right eigenvectors would result in the table

\begin{eqnarray*}
\frac{\vert i><i \vert}{<i \vert i>}\,
\frac{\vert j><j \vert}{<j \vert j>} & = &
\delta_{ij}\, \frac{\vert i><i \vert}{<i \vert i>},
\end{eqnarray*}



containing matrices which are idempotent and mutually orthogonal, each annihilating all the others.
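The table itself can be verified numerically. In the sketch below the columns of $V$ serve as the kets and the rows of $V^{-1}$ as the bras; since $V^{-1}V = I$, the denominators $<i \vert i>$ are already one, and each product reduces to $\delta_{ij}$ times the projector:

\begin{verbatim}
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lams, V = np.linalg.eig(M)   # columns of V: kets |i>
W = np.linalg.inv(V)         # rows of W: bras <i|, <i|j> = delta_ij

# P_i = |i><i| / <i|i>; the normalization is built into W.
P = [np.outer(V[:, i], W[i, :]) for i in range(len(lams))]

# The multiplication table: P_i P_j = delta_ij P_i.
for i in range(len(P)):
    for j in range(len(P)):
        target = P[i] if i == j else np.zeros_like(P[i])
        assert np.allclose(P[i] @ P[j], target)
print("table verified")
\end{verbatim}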

