Some years ago we saw how we could use the Jacobi algorithm to find the eigensystem of a real-valued symmetric matrix M, which is defined as the set of pairs of non-zero vectors v_i and scalars λ_i that satisfy
M × v_i = λ_i × v_i
known as the eigenvectors and the eigenvalues respectively, with the vectors typically restricted to those of unit length in which case we can define its spectral decomposition as the product
M = V × Λ × V^T
where the columns of V are the unit eigenvectors, Λ is a diagonal matrix whose ith diagonal element is the eigenvalue associated with the ith column of V and the T superscript denotes the transpose, in which the rows and columns of the matrix are swapped.
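By way of illustration, here is a minimal sketch in Python using numpy (not part of the original post; the matrix is a made-up example) that finds the eigensystem of a small symmetric matrix with numpy.linalg.eigh and checks that the spectral decomposition recovers it.

import numpy as np

# A small real-valued symmetric matrix, chosen purely for illustration.
M = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.5],
              [2.0, 0.5, 5.0]])

# eigh is numpy's solver for symmetric matrices; it returns the eigenvalues
# and a matrix V whose columns are the corresponding unit eigenvectors.
eigenvalues, V = np.linalg.eigh(M)
Lambda = np.diag(eigenvalues)

# Verify that M = V × Λ × V^T up to floating point rounding error.
print(np.allclose(V @ Lambda @ V.T, M))     # True
# And that each column satisfies M × v_i = λ_i × v_i.
print(np.allclose(M @ V, V * eigenvalues))  # True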
You may recall that this is a particularly convenient representation of the matrix since we can use it to generalise any scalar function to it with
f(M) = V × f(Λ) × V^T
where f(Λ) is the diagonal matrix whose ith diagonal element is the result of applying f to the ith diagonal element of Λ.
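To show the idea in practice, here is a short sketch, again assuming numpy; the helper apply_to_matrix is hypothetical and simply applies a scalar function to a symmetric matrix through its spectral decomposition, demonstrated with the matrix square root.

import numpy as np

def apply_to_matrix(f, M):
    """Apply the scalar function f to the symmetric matrix M via its
    spectral decomposition, f(M) = V × f(Λ) × V^T."""
    eigenvalues, V = np.linalg.eigh(M)
    return V @ np.diag(f(eigenvalues)) @ V.T

# Example: the square root of a positive definite matrix.
M = np.array([[4.0, 1.0],
              [1.0, 3.0]])
R = apply_to_matrix(np.sqrt, M)
print(np.allclose(R @ R, M))  # True, since sqrt(M) × sqrt(M) = M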
You may also recall that I suggested that there's a more efficient way to find eigensystems and I think that it's high time that we took a look at it.