For a complex matrix \(\mathbf{M}\), its transpose \(\mathbf{M}^\mathrm{T}\) is the matrix formed by swapping its rows with its columns, and its conjugate \(\mathbf{M}^\ast\) is the matrix formed by negating the imaginary parts of its elements.
A few posts ago^{[1]} I suggested that the conjugate of the transpose, known as the adjoint \(\mathbf{M}^\mathrm{H}\), was sufficiently important to be given the overloaded arithmetic operator ak.adjoint. The reason for this is that many of the useful properties of the transpose of a real matrix are also true of the adjoint of a complex matrix, not least of which is that complex matrices that are equal to their adjoints, known as Hermitian matrices, behave in many ways like symmetric real matrices.

Before we take a look at them, however, we'll need to review some properties of complex conjugates.
Properties Of Complex Conjugates
These are simple enough that they needn't be relegated to a sidebar, so let's begin by defining a pair of complex numbers and their conjugates with
\[
\begin{align*}
\mathbf{z}_0 &= x_0 + i y_0\\
\mathbf{z}_0^\ast &= x_0 - i y_0\\
\\
\mathbf{z}_1 &= x_1 + i y_1\\
\mathbf{z}_1^\ast &= x_1 - i y_1
\end{align*}
\]
where \(x_i\) and \(y_i\) are real numbers and \(i\) is the positive imaginary unit, or square root of minus one.

From this we can trivially see that the conjugate of the conjugate is the original complex number and, only slightly less trivially, that the sum of a complex number and its conjugate must be real and that the difference between them must be purely imaginary
\[
\mathbf{z}_0 + \mathbf{z}_0^\ast
= \left(x_0 + i y_0\right) + \left(x_0 - i y_0\right)
= \left(x_0 + x_0\right) + i\left(y_0 - y_0\right)
= 2 \times x_0\\
\mathbf{z}_0 - \mathbf{z}_0^\ast
= \left(x_0 + i y_0\right) - \left(x_0 - i y_0\right)
= \left(x_0 - x_0\right) + i\left(y_0 + y_0\right)
= 2 \times i y_0
\]
Next, the product of a complex number and its conjugate is equal to the square of its magnitude
\[
\begin{align*}
\mathbf{z}_0 \times \mathbf{z}_0^\ast
&= \left(x_0 + i y_0\right) \times \left(x_0 - i y_0\right)\\
&= \left(x_0 \times x_0 - i y_0 \times i y_0\right) + i\left(y_0 \times x_0 - x_0 \times y_0\right)\\
&= x_0^2 + y_0^2\\
&= \left|\mathbf{z}_0\right|^2
\end{align*}
\]
Furthermore, the sum of the conjugates of the two complex numbers equals the conjugate of their sum
\[
\begin{align*}
\mathbf{z}_0^\ast + \mathbf{z}_1^\ast
&= \left(x_0 - i y_0\right) + \left(x_1 - i y_1\right)\\
&= \left(x_0 + x_1\right) - i\left(y_0 + y_1\right)\\
\\
\left(\mathbf{z}_0 + \mathbf{z}_1\right)^\ast
&= \left(\left(x_0 + i y_0\right) + \left(x_1 + i y_1\right)\right)^\ast\\
&= \left(\left(x_0 + x_1\right) + i\left(y_0 + y_1\right)\right)^\ast\\
&= \left(x_0 + x_1\right) - i\left(y_0 + y_1\right)
\end{align*}
\]
Finally, the product of the conjugates of the two complex numbers also equals the conjugate of their product
\[
\begin{align*}
\mathbf{z}_0^\ast \times \mathbf{z}_1^\ast
&= \left(x_0 - i y_0\right) \times \left(x_1 - i y_1\right)\\
&= \left(x_0 \times x_1 + i y_0 \times i y_1\right) - i\left(x_0 \times y_1 + y_0 \times x_1\right)\\
&= \left(x_0 \times x_1 - y_0 \times y_1\right) - i\left(x_0 \times y_1 + y_0 \times x_1\right)\\
\\
\left(\mathbf{z}_0 \times \mathbf{z}_1\right)^\ast
&= \left(\left(x_0 + i y_0\right) \times \left(x_1 + i y_1\right)\right)^\ast\\
&= \left(\left(x_0 \times x_1 + i y_0 \times i y_1\right) + i\left(x_0 \times y_1 + y_0 \times x_1\right)\right)^\ast\\
&= \left(\left(x_0 \times x_1 - y_0 \times y_1\right) + i\left(x_0 \times y_1 + y_0 \times x_1\right)\right)^\ast\\
&= \left(x_0 \times x_1 - y_0 \times y_1\right) - i\left(x_0 \times y_1 + y_0 \times x_1\right)
\end{align*}
\]
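These identities are easy to spot-check numerically. The sketch below doesn't use the ak library's complex type; it stands in a bare-bones \(\{re, im\}\) representation, with hypothetical helper names, purely for illustration.

```javascript
// A bare-bones complex type; the ak library's ak.complex and its overloaded
// arithmetic operators play this role in the posts' own code.
function c(re, im)   { return {re: re, im: im}; }
function add(z0, z1) { return c(z0.re+z1.re, z0.im+z1.im); }
function mul(z0, z1) { return c(z0.re*z1.re-z0.im*z1.im, z0.re*z1.im+z0.im*z1.re); }
function conj(z)     { return c(z.re, -z.im); }
function abs2(z)     { return z.re*z.re + z.im*z.im; }
function eq(z0, z1)  { return z0.re===z1.re && z0.im===z1.im; }

var z0 = c(1, 2);
var z1 = c(3, -4);

console.log(eq(add(z0, conj(z0)), c(2*z0.re, 0)));           // the sum is real
console.log(eq(mul(z0, conj(z0)), c(abs2(z0), 0)));          // the squared magnitude
console.log(eq(conj(add(z0, z1)), add(conj(z0), conj(z1)))); // conjugate of the sum
console.log(eq(conj(mul(z0, z1)), mul(conj(z0), conj(z1)))); // conjugate of the product
```

All four checks log true, mirroring the four derivations above.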
That out of the way, we're ready to see how Hermitian matrices are similar to real symmetric matrices.
Properties Of Hermitian Matrices
Just as the product of a real matrix and its transpose is symmetric, so the product of a complex matrix and its adjoint is Hermitian. Following the rules of matrix multiplication, we have
\[
\begin{align*}
\left(\mathbf{M} \times \mathbf{M}^\mathrm{H}\right)_{ij}
&= \sum_k \mathbf{M}_{ik} \times \mathbf{M}^\mathrm{H}_{kj}
= \sum_k \mathbf{M}_{ik} \times \mathbf{M}_{jk}^\ast\\
\left(\mathbf{M} \times \mathbf{M}^\mathrm{H}\right)_{ji}
&= \sum_k \mathbf{M}_{jk} \times \mathbf{M}^\mathrm{H}_{ki}
= \sum_k \mathbf{M}_{ik}^\ast \times \mathbf{M}_{jk}
= \sum_k \left(\mathbf{M}_{ik} \times \mathbf{M}_{jk}^\ast\right)^\ast
= \left(\mathbf{M} \times \mathbf{M}^\mathrm{H}\right)^\ast_{ij}
\end{align*}
\]
where \(\sum\) is the summation sign, and so the transpose of the product equals its conjugate and it is, by definition, a Hermitian matrix.

Now the leading diagonal elements of a Hermitian matrix \(\mathbf{M}\), being those whose row indices equal their column indices, must be real since
\[
\mathbf{M}_{ii} = \mathbf{M}^\ast_{ii}
\]
can only be true if their imaginary parts are equal to zero. Furthermore for any complex vector \(\mathbf{v}\) and its conjugate \(\mathbf{v}^\ast\) the product
\[
\mathbf{v} \times \mathbf{M} \times \mathbf{v}^\ast
\]
must be a real number, as proven in derivation 1.
Derivation 1: v × M × v^{*} For Hermitian M
Combining the rules of matrix-vector and vector-vector multiplication, we have
\[
\mathbf{v} \times \mathbf{M} \times \mathbf{v}^\ast = \sum_i \sum_j \mathbf{v}_i \times \mathbf{M}_{ij} \times \mathbf{v}^\ast_j
\]
Since a Hermitian matrix is equal to the conjugate of its transpose, we also have
\[
\mathbf{M}_{ij} = \mathbf{M}^\ast_{ji}
\]
If we split this sum into three parts, \(j\) less than \(i\), \(j\) equal to \(i\) and \(j\) greater than \(i\), then by the properties of complex conjugates we have
\[
\begin{align*}
\sum_i \sum_{j<i} \mathbf{v}_i \times \mathbf{M}_{ij} \times \mathbf{v}^\ast_j
&= \sum_i \sum_{j<i} \left(\mathbf{v}_i \times \mathbf{M}_{ij}\right) \times \mathbf{v}^\ast_j\\
\sum_i \sum_{j=i} \mathbf{v}_i \times \mathbf{M}_{ij} \times \mathbf{v}^\ast_j
&= \sum_i \mathbf{v}_i \times \mathbf{M}_{ii} \times \mathbf{v}^\ast_i
= \sum_i \mathbf{M}_{ii} \times \mathbf{v}_i \times \mathbf{v}^\ast_i
= \sum_i \mathbf{M}_{ii} \times \left|\mathbf{v}_i\right|^2\\
\sum_i \sum_{j>i} \mathbf{v}_i \times \mathbf{M}_{ij} \times \mathbf{v}^\ast_j
&= \sum_i \sum_{j>i} \mathbf{v}_i \times \left(\mathbf{M}^\ast_{ji} \times \mathbf{v}^\ast_j\right)
= \sum_j \sum_{i<j} \mathbf{v}_i \times \left(\mathbf{M}^\ast_{ji} \times \mathbf{v}^\ast_j\right)\\
&= \sum_j \sum_{i<j} \left(\mathbf{v}^\ast_j \times \mathbf{M}^\ast_{ji}\right) \times \mathbf{v}_i
= \sum_j \sum_{i<j} \left(\mathbf{v}_j \times \mathbf{M}_{ji}\right)^\ast \times \mathbf{v}_i\\
&= \sum_i \sum_{j<i} \left(\mathbf{v}_i \times \mathbf{M}_{ij}\right)^\ast \times \mathbf{v}_j
= \sum_i \sum_{j<i} \left(\left(\mathbf{v}_i \times \mathbf{M}_{ij}\right) \times \mathbf{v}^\ast_j\right)^\ast
\end{align*}
\]
Since \(\mathbf{M}_{ii}\) equals \(\mathbf{M}^\ast_{ii}\) it must be real, as is the magnitude of \(\mathbf{v}_i\), and so the second part is real. Furthermore, the third part is the conjugate of the first part and so their sum must also be real.
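Derivation 1's conclusion can be spot-checked numerically. The sketch below again uses plain \(\{re, im\}\) objects and a hypothetical quadraticForm helper rather than the ak library's types.

```javascript
// Minimal complex arithmetic, standing in for the ak library's ak.complex.
function c(re, im)   { return {re: re, im: im}; }
function add(z0, z1) { return c(z0.re+z1.re, z0.im+z1.im); }
function mul(z0, z1) { return c(z0.re*z1.re-z0.im*z1.im, z0.re*z1.im+z0.im*z1.re); }
function conj(z)     { return c(z.re, -z.im); }

// v × M × v* = sum over i and j of v_i × M_ij × v*_j
function quadraticForm(v, m) {
 var s = c(0, 0), i, j;
 for(i=0;i<v.length;++i) {
  for(j=0;j<v.length;++j) s = add(s, mul(mul(v[i], m[i][j]), conj(v[j])));
 }
 return s;
}

// a Hermitian matrix: real leading diagonal and M_ij equal to M*_ji
var m = [[c(2, 0), c(1, -1)],
         [c(1, 1), c(3, 0)]];
var v = [c(1, 2), c(-1, 1)];

console.log(quadraticForm(v, m)); // {re: 12, im: 0}, a real number
```

With integer elements the arithmetic is exact, so the imaginary part cancels to precisely zero, just as the derivation predicts.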
This means that we can generalise the concept of definiteness to complex matrices. For example, if
\[
\mathbf{v} \times \mathbf{M} \times \mathbf{v}^\ast \geqslant 0
\]
for all complex vectors \(\mathbf{v}\), then \(\mathbf{M}\) is positive semidefinite. Just as is the case for real matrices, this implies that its eigenvalues are greater than or equal to zero, as proven in derivation 2.
Derivation 2: The Eigenvalues Of M
First, let us assume that \(\mathbf{v}\) is a unit magnitude eigenvector of \(\mathbf{M}\) with an eigenvalue of \(\lambda\)
\[
\begin{align*}
\left|\mathbf{v}\right| &= 1\\
\mathbf{M} \times \mathbf{v} &= \lambda \times \mathbf{v}
\end{align*}
\]
Since every vector has a conjugate, if \(\mathbf{M}\) is positive semidefinite then both
\[
\mathbf{v} \times \mathbf{M} \times \mathbf{v}^\ast \geqslant 0
\]
and
\[
\mathbf{v}^\ast \times \mathbf{M} \times \mathbf{v} \geqslant 0
\]
must be true and from the second we have
\[
\begin{align*}
\mathbf{v}^\ast \times \mathbf{M} \times \mathbf{v}
&= \mathbf{v}^\ast \times (\mathbf{M} \times \mathbf{v})\\
&= \mathbf{v}^\ast \times (\lambda \times \mathbf{v})\\
&= \lambda \times \left(\mathbf{v}^\ast \times \mathbf{v}\right)\\
&= \lambda \times \left|\mathbf{v}\right|^2\\
&= \lambda
\end{align*}
\]
and so \(\lambda\) must be greater than or equal to zero.
It is also the case that
\[
\left(\mathbf{v} \times \mathbf{M}\right)^\ast = \mathbf{M}^\mathrm{H} \times \mathbf{v}^\ast
\]
since, by the rules of matrixvector multiplication and the products and sums of complex conjugates
\[
\begin{align*}
\left(\mathbf{v} \times \mathbf{M}\right)^\ast_j &= \left(\sum_i \mathbf{v}_i \times \mathbf{M}_{ij}\right)^\ast\\
&= \sum_i \mathbf{v}^\ast_i \times \mathbf{M}_{ij}^\ast
\end{align*}
\]
and
\[
\begin{align*}
\left(\mathbf{M}^\mathrm{H} \times \mathbf{v}^\ast\right)_j &= \sum_i \mathbf{M}^\mathrm{H}_{ji} \times \mathbf{v}^\ast_i\\
&= \sum_i \mathbf{v}^\ast_i \times \mathbf{M}^\ast_{ij}
\end{align*}
\]
Note that this is the complex equivalent of the relationship
\[
\mathbf{v} \times \mathbf{M} = \mathbf{M}^\mathrm{T} \times \mathbf{v}
\]
for real matrices.

From this it follows that the product of a complex matrix and its adjoint must be positive semidefinite, since
\[
\begin{align*}
\mathbf{v} \times \left(\mathbf{M} \times \mathbf{M}^\mathrm{H}\right) \times \mathbf{v}^\ast
&= \left(\mathbf{v} \times \mathbf{M}\right) \times \left(\mathbf{M}^\mathrm{H} \times \mathbf{v}^\ast\right)\\
&= \left(\mathbf{v} \times \mathbf{M}\right) \times \left(\mathbf{v} \times \mathbf{M}\right)^\ast\\
&= \left|\mathbf{v} \times \mathbf{M}\right|^2 \geqslant 0
\end{align*}
\]
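This identity can also be confirmed numerically. The following sketch, again with plain \(\{re, im\}\) objects and hypothetical helper names rather than the ak library's types, computes \(\mathbf{v} \times \left(\mathbf{M} \times \mathbf{M}^\mathrm{H}\right) \times \mathbf{v}^\ast\) both directly and as the squared magnitude of \(\mathbf{v} \times \mathbf{M}\).

```javascript
// Minimal complex arithmetic, standing in for ak.complex and friends.
function c(re, im)   { return {re: re, im: im}; }
function add(z0, z1) { return c(z0.re+z1.re, z0.im+z1.im); }
function mul(z0, z1) { return c(z0.re*z1.re-z0.im*z1.im, z0.re*z1.im+z0.im*z1.re); }
function conj(z)     { return c(z.re, -z.im); }

// (M × M^H)_ij = sum over k of M_ik × M*_jk, as derived earlier
function mulAdjoint(m) {
 var n = m.length, p = new Array(n), i, j, k, s;
 for(i=0;i<n;++i) {
  p[i] = new Array(n);
  for(j=0;j<n;++j) {
   s = c(0, 0);
   for(k=0;k<m[i].length;++k) s = add(s, mul(m[i][k], conj(m[j][k])));
   p[i][j] = s;
  }
 }
 return p;
}

// v × M × v* = sum over i and j of v_i × M_ij × v*_j
function quadraticForm(v, m) {
 var s = c(0, 0), i, j;
 for(i=0;i<v.length;++i) {
  for(j=0;j<v.length;++j) s = add(s, mul(mul(v[i], m[i][j]), conj(v[j])));
 }
 return s;
}

// the squared magnitude of the vector-matrix product v × M
function vecMatAbs2(v, m) {
 var n = m[0].length, s = 0, i, j, w;
 for(j=0;j<n;++j) {
  w = c(0, 0);
  for(i=0;i<v.length;++i) w = add(w, mul(v[i], m[i][j]));
  s += w.re*w.re + w.im*w.im;
 }
 return s;
}

var m = [[c(1, 2), c(0, -1)],
         [c(2, 0), c(3, 1)]];
var v = [c(1, -1), c(0, 2)];

console.log(quadraticForm(v, mulAdjoint(m))); // {re: 68, im: 0}
console.log(vecMatAbs2(v, m));                // 68
```

Both routes yield the same real, non-negative value, whatever complex matrix and vector you try.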
A Complex Cholesky Decomposition
Another property that Hermitian complex matrices have in common with symmetric real matrices is that, if they are positive definite, satisfying
\[
\mathbf{v} \times \mathbf{M} \times \mathbf{v}^\ast > 0
\]
then we can uniquely find a lower triangular matrix \(\mathbf{L}\) such that
\[
\mathbf{L} \times \mathbf{L}^\mathrm{H} = \mathbf{M}
\]
This is the complex Cholesky decomposition and can be found with the rules
\[
\begin{align*}
\mathbf{L}_{00} &= \sqrt{\mathbf{M}_{00}}\\
\mathbf{L}_{i0} &= \frac{\mathbf{M}_{i0}}{\mathbf{L}_{00}} && \mathrm{for}\;i>0\\
\mathbf{L}_{jj} &= \sqrt{\mathbf{M}_{jj} - \sum_{k=0}^{j-1} \mathbf{L}_{jk} \times \mathbf{L}^\ast_{jk}} && \mathrm{for}\;j>0\\
\mathbf{L}_{ij} &= \frac{\mathbf{M}_{ij} - \sum_{k=0}^{j-1} \mathbf{L}_{ik} \times \mathbf{L}^\ast_{jk}}{\mathbf{L}_{jj}} && \mathrm{for}\;j>0, i>j\\
\mathbf{L}_{ij} &= 0 && \mathrm{for}\;j>0, i<j
\end{align*}
\]
The proof of this is almost exactly the same as that for the Cholesky decomposition for real matrices, to which I refer you for the details^{[3]}.

Note that the terms inside the square roots must be real since \(\mathbf{M}\) is Hermitian and so the elements on its leading diagonal are real, and the products of complex numbers and their conjugates are also real. That they are strictly positive if \(\mathbf{M}\) is positive definite follows from the same argument that we used for real matrices, to which I again refer you for the details.
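Translated directly into code, those rules might look something like the following sketch; note that this is not the ak library's implementation, just an illustration using plain \(\{re, im\}\) objects in place of ak.complexMatrix elements.

```javascript
// Minimal complex arithmetic, standing in for the ak library's operators.
function c(re, im)   { return {re: re, im: im}; }
function sub(z0, z1) { return c(z0.re-z1.re, z0.im-z1.im); }
function mul(z0, z1) { return c(z0.re*z1.re-z0.im*z1.im, z0.re*z1.im+z0.im*z1.re); }
function conj(z)     { return c(z.re, -z.im); }
function div(z0, z1) { // z0/z1 = z0 × z1* / |z1|^2
 var d = z1.re*z1.re + z1.im*z1.im;
 return c((z0.re*z1.re+z0.im*z1.im)/d, (z0.im*z1.re-z0.re*z1.im)/d);
}

function complexCholesky(m) {
 var n = m.length, l = new Array(n), i, j, k, s;
 for(i=0;i<n;++i) l[i] = new Array(n);

 for(j=0;j<n;++j) {
  // the diagonal: subtract the squared magnitudes of the elements to the
  // left, L_jk × L*_jk, and take the real square root
  s = m[j][j];
  for(k=0;k<j;++k) s = sub(s, mul(l[j][k], conj(l[j][k])));
  l[j][j] = c(Math.sqrt(s.re), 0);

  for(i=0;i<n;++i) {
   if(i<j) l[i][j] = c(0, 0);
   else if(i>j) {
    s = m[i][j];
    for(k=0;k<j;++k) s = sub(s, mul(l[i][k], conj(l[j][k])));
    l[i][j] = div(s, l[j][j]);
   }
  }
 }
 return l;
}

// a positive definite Hermitian matrix: real diagonal and M_01 = M*_10
var m = [[c(4, 0),  c(2, 2)],
         [c(2, -2), c(10, 0)]];
var l = complexCholesky(m);

console.log(l[0][0]); // {re: 2, im: 0}
console.log(l[1][0]); // {re: 1, im: -1}
console.log(l[1][1]); // {re: 2.828..., im: 0}, the square root of 8
```

Multiplying \(\mathbf{L}\) by its adjoint recovers \(\mathbf{M}\), as you can readily confirm by hand for this two by two example.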
The implementation of the complex Cholesky decomposition is similarly almost identical to that of the real Cholesky decomposition, ak.choleskyDecomposition, and I shall similarly refer you back to the latter^{[4]} for most of the details of the former.

Listing 1 gives its implementation, ak.complexCholeskyDecomposition.
Listing 1: ak.complexCholeskyDecomposition
ak.COMPLEX_CHOLESKY_DECOMPOSITION_T = 'ak.complexCholeskyDecomposition';

function ComplexCholeskyDecomposition(){}
ComplexCholeskyDecomposition.prototype = {
 TYPE: ak.COMPLEX_CHOLESKY_DECOMPOSITION_T,
 valueOf: function(){return ak.NaN;}
};

ak.complexCholeskyDecomposition = function(arg) {
 var cd = new ComplexCholeskyDecomposition();
 var state = {l:undefined};

 constructors[ak.type(arg)](state, arg);

 cd.l = function() {return state.l;};
 cd.toMatrix = function() {return toMatrix(state.l);};

 cd.toString = function() {return '{l:'+state.l.toString()+'}';};
 cd.toExponential = function(d) {return '{l:'+state.l.toExponential(d)+'}';};
 cd.toFixed = function(d) {return '{l:'+state.l.toFixed(d)+'}';};
 cd.toPrecision = function(d) {return '{l:'+state.l.toPrecision(d)+'}';};

 return Object.freeze(cd);
};

var constructors = {};
This follows our usual scheme of defining a type string that is stored in the TYPE member and will be used to dispatch to appropriate arithmetic operations for ak.complexCholeskyDecomposition objects, of dispatching to a constructors object for initialisation and of providing property access and conversion methods.

The only significant difference between this and ak.choleskyDecomposition is that the l member of the internal state object is an ak.complexMatrix and so we must use ak arithmetic operations when initialising and recovering the matrix, as illustrated by the toMatrix helper function given in listing 2.

Listing 2: The toMatrix Helper Function
function toMatrix(l) {
 var n = l.rows();
 var m = new Array(n);
 var i, j, k, s;

 for(i=0;i<n;++i) m[i] = new Array(n);

 for(i=0;i<n;++i) {
  for(j=0;j<=i;++j) {
   s = 0;
   for(k=0;k<=j;++k) s = ak.add(s, ak.mul(l.at(i, k), ak.conj(l.at(j, k))));
   m[i][j] = s;
   m[j][i] = ak.conj(s);
  }
 }
 return ak.complexMatrix(m);
}
The full set of constructors are given in listing 3.
Listing 3: The ak.complexCholeskyDecomposition Constructors
constructors[ak.COMPLEX_MATRIX_T];
constructors[ak.MATRIX_T];
constructors[ak.OBJECT_T];
constructors[ak.CHOLESKY_DECOMPOSITION_T];
constructors[ak.COMPLEX_CHOLESKY_DECOMPOSITION_T];
Note that, much as was the case for real matrices, if the constructor is passed a matrix that is lower triangular then it is assumed to be the \(\mathbf{L}\) matrix of the decomposition; otherwise the Cholesky decomposition is performed upon the average of the matrix and its adjoint, which is guaranteed to be Hermitian.
Finally we have the arithmetic operations of the determinant, the inverse and the solution of simultaneous linear complex equations, achieved by division by a Cholesky decomposition, as shown by listing 4.
Listing 4: ak.complexCholeskyDecomposition Overloads
ak.overload(ak.det, ak.COMPLEX_CHOLESKY_DECOMPOSITION_T, det);
ak.overload(ak.inv, ak.COMPLEX_CHOLESKY_DECOMPOSITION_T, inv);

ak.overload(ak.div, [ak.COMPLEX_MATRIX_T, ak.COMPLEX_CHOLESKY_DECOMPOSITION_T], divWD);
ak.overload(ak.div, [ak.COMPLEX_T, ak.COMPLEX_CHOLESKY_DECOMPOSITION_T], divCD);
ak.overload(ak.div, [ak.COMPLEX_VECTOR_T, ak.COMPLEX_CHOLESKY_DECOMPOSITION_T], divZD);
ak.overload(ak.div, [ak.MATRIX_T, ak.COMPLEX_CHOLESKY_DECOMPOSITION_T], divWD);
ak.overload(ak.div, [ak.NUMBER_T, ak.COMPLEX_CHOLESKY_DECOMPOSITION_T], divCD);
ak.overload(ak.div, [ak.VECTOR_T, ak.COMPLEX_CHOLESKY_DECOMPOSITION_T], divZD);
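To see why division by the decomposition solves systems of equations, note that with \(\mathbf{M} = \mathbf{L} \times \mathbf{L}^\mathrm{H}\) we can solve \(\mathbf{M} \times \mathbf{x} = \mathbf{b}\) with a forward substitution through \(\mathbf{L}\) followed by a backward substitution through \(\mathbf{L}^\mathrm{H}\). The sketch below illustrates this with plain \(\{re, im\}\) objects and a hypothetical solve helper; it is not the ak library's code.

```javascript
// Minimal complex arithmetic, standing in for the ak library's operators.
function c(re, im)   { return {re: re, im: im}; }
function sub(z0, z1) { return c(z0.re-z1.re, z0.im-z1.im); }
function mul(z0, z1) { return c(z0.re*z1.re-z0.im*z1.im, z0.re*z1.im+z0.im*z1.re); }
function conj(z)     { return c(z.re, -z.im); }
function div(z0, z1) { // z0/z1 = z0 × z1* / |z1|^2
 var d = z1.re*z1.re + z1.im*z1.im;
 return c((z0.re*z1.re+z0.im*z1.im)/d, (z0.im*z1.re-z0.re*z1.im)/d);
}

// solve L × L^H × x = b, exploiting the fact that L^H_ik = L*_ki
function solve(l, b) {
 var n = l.length, y = new Array(n), x = new Array(n), i, k, s;

 // forward substitution: L × y = b
 for(i=0;i<n;++i) {
  s = b[i];
  for(k=0;k<i;++k) s = sub(s, mul(l[i][k], y[k]));
  y[i] = div(s, l[i][i]);
 }

 // backward substitution: L^H × x = y
 for(i=n-1;i>=0;--i) {
  s = y[i];
  for(k=i+1;k<n;++k) s = sub(s, mul(conj(l[k][i]), x[k]));
  x[i] = div(s, conj(l[i][i]));
 }
 return x;
}

// the decomposition of M = [[4, 2+2i], [2-2i, 10]] worked out by hand
var l = [[c(2, 0),  c(0, 0)],
         [c(1, -1), c(Math.sqrt(8), 0)]];

// b is the first column of M, so x should be the first basis vector
var x = solve(l, [c(4, 0), c(2, -2)]);
console.log(x); // [{re: 1, im: 0}, {re: 0, im: 0}]
```

Each substitution involves only a triangular matrix and so costs a fraction of a general matrix inversion, which is rather the point of keeping the decomposition around.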
The implementation of the ak.complexCholeskyDecomposition constructors and its overloads can be found in ComplexCholeskyDecomposition.js, for the details of which I shall, as previously stated, refer you to those of the ak.choleskyDecomposition class.

Program 1 demonstrates that ak.complexCholeskyDecomposition correctly recovers a lower triangular complex matrix from its product with its adjoint, as well as that product.
Program 1: Recovering L And M
Note that the elements on the leading diagonal of l must be real and positive for it to be recovered, since the leading diagonal elements of the lower triangular matrix of the complex Cholesky decomposition are, by definition, real and positive. However, m will be recovered whenever they are non-zero, which you might test by changing them to negative values or to complex values with non-zero imaginary parts.

Finally, program 2 demonstrates that ak.inv correctly recovers the inverse of a complex positive definite matrix.
Program 2: Computing The Inverse
Now I hope that all of this has convinced you that Hermitian matrices are indeed the complex equivalents of real symmetric matrices because it's all the evidence that I'm going to present!
References
[1] The Matrix Isn't Real, www.thusspakeak.com, 2015.
[2] What Are You Pointing At?, www.thusspakeak.com, 2015.
[3] Are You Definitely Positive, Mr Cholesky?, www.thusspakeak.com, 2015.
[4] You've Got Class, Mr Cholesky!, www.thusspakeak.com, 2015.