Complexification, complex structures, and linear ordinary differential equations
1 Motivation
The solution of the initial value problem
$$x' = Ax, \qquad x(0) = x_0,$$
where $A$ is an $n \times n$ matrix over $\mathbb{R}$, is $x(t) = e^{tA} x_0$. If we want to compute the solution and if $A$ is diagonalizable, say $A = PDP^{-1}$ with $D$ diagonal, we use
$$e^{tA} = P e^{tD} P^{-1}.$$
Thus if the matrix $A$ has complex eigenvalues, then although $A \in M_n(\mathbb{R})$, it may not be the case that $P \in M_n(\mathbb{R})$. For example, if
$$A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix},$$
then the eigenvalues of $A$ are $\pm i$, and we may take
$$P = \begin{pmatrix} 1 & 1 \\ -i & i \end{pmatrix}, \qquad D = \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}.$$
For $t \in \mathbb{R}$,
$$e^{tA} = \begin{pmatrix} \cos t & -\sin t \\ \sin t & \cos t \end{pmatrix}.$$
This is similar to how Cardano’s formula, which expresses the roots of a real cubic polynomial in terms of its coefficients, involves complex numbers and yet the final result may still be real.
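To make the motivation concrete, here is a small numerical sketch (using NumPy; the matrix, variable names, and tolerance are my choices, not part of the text): diagonalizing the real matrix $A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ over $\mathbb{C}$ and forming $P e^{tD} P^{-1}$ recovers, up to rounding, the real rotation matrix.

```python
import numpy as np

# A real matrix whose eigenvalues are the complex numbers +/- i.
A = np.array([[0.0, -1.0], [1.0, 0.0]])
t = 0.7

lam, P = np.linalg.eig(A)       # diagonalize over C: A = P D P^{-1}, P complex
exptA = P @ np.diag(np.exp(t * lam)) @ np.linalg.inv(P)   # e^{tA} = P e^{tD} P^{-1}

# The imaginary parts cancel: e^{tA} is the real rotation by angle t.
R = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
assert np.allclose(exptA, R)
assert np.max(np.abs(exptA.imag)) < 1e-10
```

The intermediate matrices are complex, but the product is real to machine precision, just as in the Cardano analogy.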
In the following, unless I specify the dimension of a vector space, any statement about real or complex vector spaces applies to spaces of finite or infinite dimension.
2 Direct sums
If $V$ is a real vector space, a complex structure for $V$ is an $\mathbb{R}$-linear map $J : V \to V$ such that $J^2 = -\mathrm{id}_V$.
If $V$ is a real vector space and $J$ is a complex structure for $V$, define a complex vector space $(V, J)$ in the following way: let the set of elements of $(V, J)$ be $V$, let addition in $(V, J)$ be addition in $V$, and define scalar multiplication in $(V, J)$ by
$$(a + ib)v = av + bJv, \qquad a, b \in \mathbb{R},\ v \in V.$$
One checks that for $\alpha, \beta \in \mathbb{C}$ and $v \in V$ we have $\alpha(\beta v) = (\alpha\beta)v$, and thus that $(V, J)$ is indeed a complex vector space with this definition of scalar multiplication. (One should also verify that distributivity holds with this definition of scalar multiplication; the other properties of a vector space are satisfied because $(V, J)$ has the same addition as the real vector space $V$.)
Let $V$ be a real vector space, and define the $\mathbb{R}$-linear map $J : V \oplus V \to V \oplus V$ by
$$J(v_1, v_2) = (-v_2, v_1).$$
$J$ is a complex structure on the real vector space $V \oplus V$. The complexification of $V$ is the complex vector space $V^{\mathbb{C}} = (V \oplus V, J)$. Thus, $V^{\mathbb{C}}$ has the same set of elements as $V \oplus V$, the same addition as $V \oplus V$, and scalar multiplication
$$(a + ib)(v_1, v_2) = a(v_1, v_2) + bJ(v_1, v_2),$$
which gives
$$(a + ib)(v_1, v_2) = (av_1 - bv_2, bv_1 + av_2).$$
If the real vector space $V$ has dimension $n$ and if $\{e_1, \dots, e_n\}$ is a basis for $V$, then
$$\{(e_1, 0), \dots, (e_n, 0), (0, e_1), \dots, (0, e_n)\}$$
is a basis for the real vector space $V \oplus V$. Let $(v_1, v_2) \in V^{\mathbb{C}}$. Using the basis for the real vector space $V \oplus V$, there exist
$$a_1, \dots, a_n, b_1, \dots, b_n \in \mathbb{R}$$
such that
$$(v_1, v_2) = \sum_{j=1}^n a_j (e_j, 0) + \sum_{j=1}^n b_j (0, e_j) = \sum_{j=1}^n a_j (e_j, 0) + \sum_{j=1}^n b_j \, i \, (e_j, 0) = \sum_{j=1}^n (a_j + ib_j)(e_j, 0),$$
where in the last line we used the definition of scalar multiplication in $V^{\mathbb{C}}$, namely $i(e_j, 0) = J(e_j, 0) = (0, e_j)$. One checks that the set $\{(e_1, 0), \dots, (e_n, 0)\}$ is linearly independent over $\mathbb{C}$, and therefore it is a basis for $V^{\mathbb{C}}$. Hence
$$\dim_{\mathbb{C}} V^{\mathbb{C}} = \dim_{\mathbb{R}} V.$$
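The pair model of the complexification can be sketched in code. In the following NumPy snippet (my own notation, not from the text), scalars act on pairs by $(a+ib)(v_1, v_2) = (av_1 - bv_2, bv_1 + av_2)$, and we check that $i(e_j, 0) = (0, e_j)$ and that $\alpha(\beta v) = (\alpha\beta)v$.

```python
import numpy as np

# Model V^C = V + V of V = R^n as pairs (v1, v2), with scalar multiplication
# (a+ib)(v1, v2) = (a v1 - b v2, b v1 + a v2).
def scal(alpha: complex, pair):
    v1, v2 = pair
    a, b = alpha.real, alpha.imag
    return (a * v1 - b * v2, b * v1 + a * v2)

e1 = np.array([1.0, 0.0])
# Multiplication by i sends (e1, 0) to (0, e1), i.e. i(e_j, 0) = (0, e_j).
v1, v2 = scal(1j, (e1, np.zeros(2)))
assert np.allclose(v1, 0) and np.allclose(v2, e1)

# Associativity of scalars: alpha(beta v) = (alpha beta) v.
alpha, beta = 2 - 1j, 0.5 + 3j
p = (np.array([1.0, 2.0]), np.array([-3.0, 0.5]))
lhs = scal(alpha, scal(beta, p))
rhs = scal(alpha * beta, p)
assert np.allclose(lhs[0], rhs[0]) and np.allclose(lhs[1], rhs[1])
```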
3 Complexification is a functor
If $V, W$ are real vector spaces and $T : V \to W$ is an $\mathbb{R}$-linear map, we define
$$T^{\mathbb{C}} : V^{\mathbb{C}} \to W^{\mathbb{C}}$$
by
$$T^{\mathbb{C}}(v_1, v_2) = (Tv_1, Tv_2);$$
this is a $\mathbb{C}$-linear map. Setting $\iota_V(v) = (v, 0)$ and $\iota_W(w) = (w, 0)$, $T^{\mathbb{C}}$ is the unique $\mathbb{C}$-linear map such that $T^{\mathbb{C}} \circ \iota_V = \iota_W \circ T$. (See Keith Conrad's note, https://kconrad.math.uconn.edu/blurbs/linmultialg/complexification.pdf)
Complexification is a functor from the category of real vector spaces to the category of complex vector spaces:
$$(\mathrm{id}_V)^{\mathbb{C}}(v_1, v_2) = (v_1, v_2),$$
so $(\mathrm{id}_V)^{\mathbb{C}} = \mathrm{id}_{V^{\mathbb{C}}}$, and if $T : V \to W$ and $S : W \to X$ are $\mathbb{R}$-linear maps, then
$$(S \circ T)^{\mathbb{C}}(v_1, v_2) = (STv_1, STv_2) = S^{\mathbb{C}}(Tv_1, Tv_2) = (S^{\mathbb{C}} \circ T^{\mathbb{C}})(v_1, v_2),$$
so $(S \circ T)^{\mathbb{C}} = S^{\mathbb{C}} \circ T^{\mathbb{C}}$.
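As a sketch of the construction just described (NumPy; the example matrix is mine): the complexified map $T^{\mathbb{C}}(v_1, v_2) = (Tv_1, Tv_2)$ commutes with complex scalar multiplication on pairs, which is the numerical content of $\mathbb{C}$-linearity.

```python
import numpy as np

T = np.array([[2.0, 1.0], [0.0, -1.0]])   # an arbitrary real linear map

def TC(pair):
    # complexification of T acting on the pair model of V^C
    v1, v2 = pair
    return (T @ v1, T @ v2)

def scal(alpha, pair):
    # (a+ib)(v1, v2) = (a v1 - b v2, b v1 + a v2)
    v1, v2 = pair
    return (alpha.real * v1 - alpha.imag * v2,
            alpha.imag * v1 + alpha.real * v2)

p = (np.array([1.0, 2.0]), np.array([0.5, -1.0]))
alpha = 1.5 - 2j
lhs = TC(scal(alpha, p))
rhs = scal(alpha, TC(p))
assert np.allclose(lhs[0], rhs[0]) and np.allclose(lhs[1], rhs[1])
```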
4 Complexifying a complex structure
If $V$ is a real vector space and $J : V \to V$ is a complex structure, then
$$J^{\mathbb{C}} J^{\mathbb{C}}(v_1, v_2) = (J^2 v_1, J^2 v_2) = (-v_1, -v_2),$$
so $(J^{\mathbb{C}})^2 = -\mathrm{id}_{V^{\mathbb{C}}}$. Let
$$V^{1,0} = \{(v, -Jv) : v \in V\}, \qquad V^{0,1} = \{(v, Jv) : v \in V\}.$$
($V^{1,0}$ and $V^{0,1}$ are respectively the $+i$ and $-i$ eigenspaces of $J^{\mathbb{C}}$.) If $(v_1, v_2) \in V^{\mathbb{C}}$, then one checks that
$$\frac{1}{2}\left(v_1 + Jv_2,\ v_2 - Jv_1\right) \in V^{1,0}$$
and
$$\frac{1}{2}\left(v_1 - Jv_2,\ v_2 + Jv_1\right) \in V^{0,1}.$$
It follows that
$$V^{\mathbb{C}} = V^{1,0} \oplus V^{0,1}.$$
5 Complex structures, inner products, and symplectic forms
If $V$ is a real vector space of odd dimension, then there is no linear map $J : V \to V$ satisfying $J^2 = -\mathrm{id}_V$, i.e. there does not exist a complex structure for it: taking determinants, $(\det J)^2 = \det(-\mathrm{id}_V) = (-1)^{\dim V} = -1$, which is impossible for real $\det J$. On the other hand, if $V$ has even dimension $2n$, let
$$\{e_1, \dots, e_n, f_1, \dots, f_n\}$$
be a basis for the real vector space $V$, and define $J$ by
$$Je_j = f_j, \qquad Jf_j = -e_j, \qquad 1 \leq j \leq n.$$
Then $J$ is a complex structure.
If $V$ is a real vector space of dimension $2n$ with a complex structure $J$, let $e_1 \neq 0$. Check that the set $\{e_1, Je_1\}$ is linearly independent. If $n > 1$, let
$$e_2 \notin \mathrm{span}\{e_1, Je_1\}.$$
Check that the set $\{e_1, Je_1, e_2, Je_2\}$ is linearly independent. If $n > 2$ then there is some
$$e_3 \notin \mathrm{span}\{e_1, Je_1, e_2, Je_2\},$$
and so on. I assert that, continuing in this way,
$$\{e_1, Je_1, \dots, e_n, Je_n\}$$
is a basis for $V$.
Using the above basis for $V$, let
$$u = \sum_{j=1}^n (a_j e_j + b_j Je_j), \qquad v = \sum_{j=1}^n (c_j e_j + d_j Je_j),$$
and define $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{R}$ by
$$\langle u, v \rangle = \sum_{j=1}^n (a_j c_j + b_j d_j).$$
Check that this is an inner product on the real vector space $V$. Moreover,
$$\langle e_j, e_k \rangle = \delta_{j,k}, \qquad \langle Je_j, Je_k \rangle = \delta_{j,k}, \qquad \langle e_j, Je_k \rangle = 0, \qquad \langle Je_j, e_k \rangle = 0.$$
Hence for any $u, v \in V$,
$$\langle Ju, Jv \rangle = \langle u, v \rangle.$$
We say that the complex structure $J$ is compatible with the inner product $\langle \cdot, \cdot \rangle$, i.e. $J$ is an orthogonal transformation.
A symplectic form on a real vector space $V$ is a bilinear form $\omega : V \times V \to \mathbb{R}$ such that $\omega(u, v) = -\omega(v, u)$ for all $u, v \in V$, and such that if $\omega(u, v) = 0$ for all $v \in V$ then $u = 0$; we say respectively that $\omega$ is skew-symmetric and non-degenerate. If a real vector space $V$ has a complex structure $J$, and $\langle \cdot, \cdot \rangle$ is an inner product on $V$ that is compatible with $J$, define $\omega : V \times V \to \mathbb{R}$ by
$$\omega(u, v) = \langle Ju, v \rangle,$$
which is equivalent to
$$\langle u, v \rangle = \omega(u, Jv).$$
Using that the inner product is compatible with $J$ and that it is symmetric,
$$\omega(u, v) = \langle Ju, v \rangle = \langle J^2 u, Jv \rangle = -\langle u, Jv \rangle = -\langle Jv, u \rangle = -\omega(v, u),$$
so $\omega$ is skew-symmetric. If $u \in V$ and $\omega(u, v) = 0$ for all $v \in V$, then
$$\langle Ju, v \rangle = 0$$
for all $v \in V$, and thus $Ju = 0$. Since $J$ is invertible, $u = 0$. Thus $\omega$ is nondegenerate. Therefore $\omega$ is a symplectic form on $V$. (Using the basis $\{e_1, Je_1, \dots, e_n, Je_n\}$ for $V$, we have $\omega(e_j, e_k) = 0$, $\omega(Je_j, Je_k) = 0$, and $\omega(e_j, Je_k) = \delta_{j,k}$. A basis for a symplectic vector space that satisfies these three conditions is called a Darboux basis.) We have
$$\omega(Ju, Jv) = \langle J^2 u, Jv \rangle = \langle Ju, v \rangle = \omega(u, v).$$
We say that $J$ is compatible with the symplectic form $\omega$, namely, $J$ is a symplectic transformation.
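For the basis-built complex structure these facts can be checked numerically. A sketch (NumPy; the basis ordering $e_1, Je_1, \dots, e_n, Je_n$, the size $n = 2$, and the identification of the standard inner product with $u^{\mathsf{T}} v$ in this basis are my choices): the matrix of $\omega(u, v) = \langle Ju, v \rangle$ is $J^{\mathsf{T}}$, which is skew-symmetric and invertible, and $J$ is both orthogonal and symplectic.

```python
import numpy as np

n = 2
dim = 2 * n
# In the basis e1, Je1, e2, Je2: J sends e_j -> Je_j and Je_j -> -e_j.
blk = np.array([[0.0, -1.0], [1.0, 0.0]])
J = np.kron(np.eye(n), blk)

assert np.allclose(J @ J, -np.eye(dim))     # J^2 = -I: a complex structure
assert np.allclose(J.T @ J, np.eye(dim))    # J orthogonal: <Ju, Jv> = <u, v>

# omega(u, v) = <Ju, v> = u^T J^T v, so the matrix of omega is J^T.
Omega = J.T
assert np.allclose(Omega, -Omega.T)                 # skew-symmetric
assert not np.isclose(np.linalg.det(Omega), 0.0)    # non-degenerate
assert np.allclose(J.T @ Omega @ J, Omega)          # omega(Ju, Jv) = omega(u, v)
```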
On the other hand, if $V$ is a real vector space with symplectic form $\omega$ and $J$ is a compatible complex structure, then $\langle \cdot, \cdot \rangle$ defined by
$$\langle u, v \rangle = \omega(u, Jv)$$
is an inner product on $V$ that is compatible with the complex structure $J$.
Suppose $V$ is a real vector space with complex structure $J$ and that $h$ is an inner product on the complex vector space $(V, J)$, complex-linear in its first argument. Define $g : V \times V \to \mathbb{R}$ by
$$g(u, v) = \mathrm{Re}\, h(u, v).$$
(The letter $h$ refers to a Hermitian form, i.e. an inner product on a complex vector space, and the letter $g$ refers to the usual notation for a metric on a Riemannian manifold.) It is straightforward to check that $g$ is an inner product on the real vector space $V$. Similarly, define $\omega : V \times V \to \mathbb{R}$ by
$$\omega(u, v) = -\mathrm{Im}\, h(u, v).$$
It is apparent that $\omega$ is skew-symmetric. If $\omega(u, v) = 0$ for all $v \in V$, then in particular $\omega(u, Ju) = 0$, and so
$$\mathrm{Im}\, h(u, Ju) = 0.$$
As $h$ is a complex inner product,
$$h(u, Ju) = h(u, iu) = \bar{i}\, h(u, u) = -i\, h(u, u),$$
i.e.
$$\mathrm{Im}\, h(u, Ju) = -h(u, u),$$
and thus $h(u, u) = 0$, which implies that $u = 0$. Therefore $\omega$ is nondegenerate, and thus $\omega$ is a symplectic form on the real vector space $V$. With these definitions of $g$ and $\omega$, for $u, v \in V$ we have
$$h(u, v) = g(u, v) - i\,\omega(u, v),$$
which writes the inner product on the complex vector space $(V, J)$ using an inner product on the real vector space $V$ and a symplectic form on the real vector space $V$; note that $(V, J)$ has the same set of elements as $V$. Moreover, for $u, v \in V$ we have
$$\omega(u, v) = g(Ju, v).$$
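These relations among $h$, $g$, and $\omega$ can be checked on $\mathbb{C}^n$, where the complex structure is multiplication by $i$. A sketch (NumPy; the convention that $h$ is complex-linear in its first argument is an assumption here, and with the opposite convention some signs flip):

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(3) + 1j * rng.standard_normal(3)
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)

def h(u, v):
    # Hermitian inner product on C^3, complex-linear in the first argument
    return np.sum(u * np.conj(v))

g = lambda u, v: h(u, v).real        # real inner product g = Re h
w = lambda u, v: -h(u, v).imag       # symplectic form omega = -Im h

# h = g - i*omega
assert np.isclose(h(u, v), g(u, v) - 1j * w(u, v))
# omega(u, v) = g(Ju, v), where J is multiplication by i
assert np.isclose(w(u, v), g(1j * u, v))
```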
5.1 Tensor products
Here we give another presentation of the complexification $V^{\mathbb{C}}$ of a real vector space $V$, this time using tensor products of real vector spaces. If you were satisfied by the first definition you don't need to read this one; read it if you are curious about another way to define complexification, if you want to see a pleasant application of tensor products, or if you didn't like the first definition. Let $V$ be a real vector space of dimension $n$. $\mathbb{C}$ is a real vector space of dimension $2$, and
$$\mathbb{C} \otimes_{\mathbb{R}} V$$
is a real vector space of dimension $2n$. If $V$ has basis $\{e_1, \dots, e_n\}$, then $\mathbb{C} \otimes_{\mathbb{R}} V$ has basis $\{1 \otimes e_1, \dots, 1 \otimes e_n, i \otimes e_1, \dots, i \otimes e_n\}$. Since every element of $\mathbb{C} \otimes_{\mathbb{R}} V$ can be written uniquely in the form
$$\sum_{j=1}^n a_j (1 \otimes e_j) + \sum_{j=1}^n b_j (i \otimes e_j), \qquad a_j, b_j \in \mathbb{R},$$
one often writes such an element as
$$u + iv, \qquad u, v \in V;$$
here $\mathbb{C} \otimes_{\mathbb{R}} V$ is a real vector space that is isomorphic to $V \oplus V$.
The complexification of $V$ is the complex vector space $V^{\mathbb{C}}$ whose set of elements is $\mathbb{C} \otimes_{\mathbb{R}} V$, with the same addition as the real vector space $\mathbb{C} \otimes_{\mathbb{R}} V$, and with scalar multiplication defined by
$$\alpha(\beta \otimes v) = (\alpha\beta) \otimes v, \qquad \alpha, \beta \in \mathbb{C},\ v \in V.$$
Let $x \in V^{\mathbb{C}}$. Using the basis $\{1 \otimes e_j, i \otimes e_j : 1 \leq j \leq n\}$ of the real vector space $\mathbb{C} \otimes_{\mathbb{R}} V$, there exist some
$$a_1, \dots, a_n, b_1, \dots, b_n \in \mathbb{R}$$
such that
$$x = \sum_{j=1}^n a_j (1 \otimes e_j) + \sum_{j=1}^n b_j (i \otimes e_j) = \sum_{j=1}^n (a_j + ib_j)(1 \otimes e_j),$$
where in the last line we used the definition of scalar multiplication in $V^{\mathbb{C}}$. One checks that the set $\{1 \otimes e_1, \dots, 1 \otimes e_n\}$ is linearly independent over $\mathbb{C}$, and hence that it is a basis for the complex vector space $V^{\mathbb{C}}$, so $V^{\mathbb{C}}$ has dimension $n$ over $\mathbb{C}$.
If $V$ and $W$ are real vector spaces and $T : V \to W$ is a linear map, define $T^{\mathbb{C}} : V^{\mathbb{C}} \to W^{\mathbb{C}}$ by
$$T^{\mathbb{C}}(\alpha \otimes v) = \alpha \otimes Tv.$$
With this definition of $T^{\mathbb{C}}$, one can check that complexification is a functor from the category of real vector spaces to the category of complex vector spaces.
6 Decomplexification
If $V$ is a complex vector space, let $V_{\mathbb{R}}$ be the real vector space whose set of elements is $V$, in which addition is the same as addition in $V$, and in which scalar multiplication is defined by
$$a \cdot v = av, \qquad a \in \mathbb{R},\ v \in V,$$
i.e. by restricting the scalar multiplication of $V$ to real scalars. Because $V$ is a complex vector space, it is apparent that $V_{\mathbb{R}}$ is a real vector space with this scalar multiplication. We call $V_{\mathbb{R}}$ the decomplexification of the complex vector space $V$.
If $V$ has basis $\{e_1, \dots, e_n\}$ and $v \in V$, then there are $a_1, \dots, a_n, b_1, \dots, b_n \in \mathbb{R}$ such that
$$v = \sum_{j=1}^n (a_j + ib_j) e_j = \sum_{j=1}^n a_j e_j + \sum_{j=1}^n b_j (ie_j).$$
One checks that
$$e_1, \dots, e_n, ie_1, \dots, ie_n$$
are linearly independent over $\mathbb{R}$, and hence are a basis for the real vector space $V_{\mathbb{R}}$. Thus,
$$\dim_{\mathbb{R}} V_{\mathbb{R}} = 2 \dim_{\mathbb{C}} V.$$
If $W$ is a complex vector space and $T : V \to W$ is a $\mathbb{C}$-linear map, define $T_{\mathbb{R}} : V_{\mathbb{R}} \to W_{\mathbb{R}}$ by
$$T_{\mathbb{R}} v = Tv.$$
Because $T$ is $\mathbb{C}$-linear it follows that $T_{\mathbb{R}}$ is $\mathbb{R}$-linear. Decomplexification is a functor from the category of complex vector spaces to the category of real vector spaces. Since decomplexification is defined simply by ignoring the fact that $V$ is closed under multiplication by complex scalars and only using real scalars, decomplexification is called a forgetful functor.
7 Complex conjugation in complexified vector spaces
If $V$ is a real vector space, define $C : V^{\mathbb{C}} \to V^{\mathbb{C}}$ by
$$C(v_1, v_2) = (v_1, -v_2).$$
We call $C$ complex conjugation in $V^{\mathbb{C}}$; it is conjugate-linear, and we write $\overline{z} = Cz$ for $z \in V^{\mathbb{C}}$. We have $C^2 = \mathrm{id}_{V^{\mathbb{C}}}$. If $T : V^{\mathbb{C}} \to V^{\mathbb{C}}$ is a $\mathbb{C}$-linear map, define $\overline{T} : V^{\mathbb{C}} \to V^{\mathbb{C}}$ by
$$\overline{T} = C \circ T \circ C.$$
$\overline{T}$ is a $\mathbb{C}$-linear map. It is a fact that if $T : V^{\mathbb{C}} \to V^{\mathbb{C}}$ is $\mathbb{C}$-linear, then $\overline{T} = T$ if and only if there is some $\mathbb{R}$-linear $S : V \to V$ such that $T = S^{\mathbb{C}}$. In words, a linear map on the complexification of a real vector space is equal to its own conjugate if and only if it is the complexification of a linear map on the real vector space.
The following are true statements. (These are exercises from V. I. Arnold's Ordinary differential equations, p. 122, §18.4, in Richard A. Silverman's translation.)
• If $T : \mathbb{R}^n \to \mathbb{R}^n$ is a linear map, then
$$\operatorname{tr} T^{\mathbb{C}} = \operatorname{tr} T$$
and
$$\det T^{\mathbb{C}} = \det T.$$
• If $T : \mathbb{R}^n \to \mathbb{R}^n$ is a linear map, then
$$\overline{T^{\mathbb{C}}} = T^{\mathbb{C}}.$$
• If $T : \mathbb{C}^n \to \mathbb{C}^n$ is a linear map, then
$$\operatorname{tr} T_{\mathbb{R}} = 2 \operatorname{Re} \operatorname{tr} T$$
and
$$\det T_{\mathbb{R}} = |\det T|^2.$$
• If $T : \mathbb{C}^n \to \mathbb{C}^n$ is a linear map, then the eigenvalues of $(T_{\mathbb{R}})^{\mathbb{C}}$ are the eigenvalues of $T$ together with their complex conjugates.
• If $T : V^{\mathbb{C}} \to V^{\mathbb{C}}$ is a $\mathbb{C}$-linear map, then
$$\operatorname{tr} \overline{T} = \overline{\operatorname{tr} T}$$
and
$$\det \overline{T} = \overline{\det T}.$$
• If $T : V^{\mathbb{C}} \to V^{\mathbb{C}}$ is a $\mathbb{C}$-linear map, then
$$\overline{\overline{T}} = T.$$
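Trace and determinant identities of this kind can be checked numerically by writing out the matrix of a complex linear map acting on $\mathbb{R}^{2n}$. A sketch (NumPy; the helper `decomplexify` and the example matrix are my own): each complex entry $a + bi$ becomes the real $2 \times 2$ block $\begin{pmatrix} a & -b \\ b & a \end{pmatrix}$, and then $\operatorname{tr} A_{\mathbb{R}} = 2 \operatorname{Re} \operatorname{tr} A$ and $\det A_{\mathbb{R}} = |\det A|^2$.

```python
import numpy as np

def decomplexify(A: np.ndarray) -> np.ndarray:
    """Matrix of A acting on R^{2n}: each complex entry a+bi becomes
    the 2x2 real block [[a, -b], [b, a]]."""
    n = A.shape[0]
    blocks = [[np.array([[A[j, k].real, -A[j, k].imag],
                         [A[j, k].imag,  A[j, k].real]]) for k in range(n)]
              for j in range(n)]
    return np.block(blocks)

A = np.array([[1 + 2j, 0.5], [-1j, 3 - 1j]])
AR = decomplexify(A)

assert np.isclose(np.trace(AR), 2 * np.trace(A).real)           # tr A_R = 2 Re tr A
assert np.isclose(np.linalg.det(AR), abs(np.linalg.det(A))**2)  # det A_R = |det A|^2
```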
8 Linear ordinary differential equations over
Let $A$ be an $n \times n$ matrix over $\mathbb{C}$. The solution of the initial value problem
$$z' = Az, \qquad z(0) = z_0,$$
is $z(t) = e^{tA} z_0$.
If $A$ has distinct eigenvalues $\lambda_1, \dots, \lambda_n$, then, with
$$E_j = \ker(A - \lambda_j I),$$
we have
$$\mathbb{C}^n = \bigoplus_{j=1}^n E_j,$$
where each $E_j$ has dimension 1. For $z \in E_j$,
$$e^{tA} z = e^{t\lambda_j} z.$$
Let $v_j \in E_j$ be nonzero, $1 \leq j \leq n$. They are a basis for $\mathbb{C}^n$, so there are $c_1, \dots, c_n \in \mathbb{C}$ such that
$$z_0 = \sum_{j=1}^n c_j v_j.$$
Then
$$z(t) = e^{tA} z_0 = \sum_{j=1}^n c_j e^{t\lambda_j} v_j.$$
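The eigenbasis formula for $z(t)$ can be sketched numerically (NumPy; the example matrix and the finite-difference check are my choices): expand $z_0$ in an eigenbasis and verify that $z(0) = z_0$ and $z'(t) = Az(t)$.

```python
import numpy as np

A = np.array([[0.0, -2.0], [1.0, 3.0]])   # distinct eigenvalues 1 and 2
z0 = np.array([1.0, -1.0], dtype=complex)

lam, V = np.linalg.eig(A)       # columns of V are eigenvectors v_j
c = np.linalg.solve(V, z0)      # z0 = sum_j c_j v_j

def z(t):
    # z(t) = sum_j c_j e^{t lambda_j} v_j
    return V @ (c * np.exp(lam * t))

assert np.allclose(z(0.0), z0)

# central-difference check that z'(t) = A z(t)
t, eps = 0.3, 1e-6
deriv = (z(t + eps) - z(t - eps)) / (2 * eps)
assert np.allclose(deriv, A @ z(t), atol=1e-5)
```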
Suppose that $A$ is an $n \times n$ matrix over $\mathbb{C}$, that $Av = \lambda v$, that $\lambda = a + ib$, and that $v \neq 0$. The solution of the initial value problem
$$z' = Az, \qquad z(0) = v,$$
is $z(t) = e^{t\lambda} v$. We have, as $|e^{t\lambda}| = e^{ta}$,
$$|z(t)| = e^{ta} |v|.$$
Therefore, if $a \leq 0$ and $t \geq 0$, then $|z(t)| \leq |v|$ for all such $t$.
9 Linear ordinary differential equations over
Let $A$ be an $n \times n$ matrix over $\mathbb{R}$ and let $x_0 \in \mathbb{R}^n$. Let $A^{\mathbb{C}} : (\mathbb{R}^n)^{\mathbb{C}} \to (\mathbb{R}^n)^{\mathbb{C}}$ be the complexification of $A$, and let $z$ be the solution of the initial value problem
$$z' = A^{\mathbb{C}} z, \qquad z(0) = (x_0, 0).$$
As $A^{\mathbb{C}}$ is the complexification of a real linear map, $\overline{A^{\mathbb{C}}} = A^{\mathbb{C}}$, and
$$\overline{z}' = \overline{A^{\mathbb{C}} z} = \overline{A^{\mathbb{C}}}\, \overline{z} = A^{\mathbb{C}} \overline{z},$$
so $\overline{z}$ satisfies the same differential equation as $z$. But $\overline{z}(0) = \overline{(x_0, 0)} = (x_0, 0) = z(0)$, so by uniqueness of solutions $\overline{z}(t) = z(t)$ for all $t$. Writing $z(t) = (x(t), y(t))$ with $x(t), y(t) \in \mathbb{R}^n$, the condition $\overline{z} = z$ gives $y(t) = 0$ for all $t$. Also, $z' = A^{\mathbb{C}} z = (Ax, Ay)$ and $z(0) = (x_0, 0)$, so $x' = Ax$ and $x(0) = x_0$. Therefore, $x$ is the solution of the initial value problem
$$x' = Ax, \qquad x(0) = x_0.$$
Thus, to solve an initial value problem in $\mathbb{R}^n$ we can complexify it, solve the initial value problem in $(\mathbb{R}^n)^{\mathbb{C}}$, and take the first entry of the solution of the complex initial value problem.
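A numerical sketch of this procedure (NumPy; the example matrix is mine): solving the complexified problem for a real matrix and real initial condition produces a solution whose imaginary part vanishes up to rounding, and whose real part solves the original real problem.

```python
import numpy as np

# Complexify x' = Ax, x(0) = x0: solve z' = A^C z with z(0) = (x0, 0),
# i.e. a complex initial condition with zero imaginary part.
A = np.array([[0.0, -1.0], [1.0, 0.0]])
x0 = np.array([1.0, 0.0])
z0 = x0.astype(complex)

lam, V = np.linalg.eig(A)          # eigenvalues of A^C are +/- i
c = np.linalg.solve(V, z0)
t = 1.2
zt = V @ (c * np.exp(lam * t))     # solution of the complexified problem

# Conjugation symmetry forces the imaginary part to vanish (up to rounding),
# and the real part solves the real problem: here x(t) = (cos t, sin t).
assert np.max(np.abs(zt.imag)) < 1e-9
assert np.allclose(zt.real, [np.cos(t), np.sin(t)])
```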
If $A$ is an $n \times n$ matrix over $\mathbb{R}$, let
$$p(\lambda) = \det(\lambda I - A),$$
its characteristic polynomial. The Cayley-Hamilton theorem states that
$$p(A) = 0.$$
Taking the complexification of this gives
$$p(A^{\mathbb{C}}) = 0.$$
It follows that the roots of the characteristic polynomial of $A^{\mathbb{C}}$ are the same as the roots of $p$. A complex root of $p$ is not an eigenvalue of $A$, but is indeed an eigenvalue of $A^{\mathbb{C}}$, so the roots of the characteristic polynomial of $A$ are the eigenvalues of $A^{\mathbb{C}}$.
10 Linear ordinary differential equations in
Let $A$ be a $2 \times 2$ matrix over $\mathbb{R}$. (This section follows Arnold, p. 132, §20.3.) Suppose that the roots of the characteristic polynomial
$$p(\lambda) = \det(\lambda I - A)$$
are $a \pm ib$ with $b \neq 0$, i.e. that the roots of the characteristic polynomial are complex conjugates. Let $\lambda = a + ib$. (Define $J = \frac{1}{b}(A - aI)$. We have $p(\lambda) = \lambda^2 - 2a\lambda + (a^2 + b^2)$. By the Cayley-Hamilton theorem, $A^2 - 2aA + (a^2 + b^2)I = 0$, and written using $J$ this is $b^2 J^2 + b^2 I = 0$. Hence $J^2 = -I$, so $J$ is a complex structure on $\mathbb{R}^2$.) $\lambda$ is an eigenvalue of $A^{\mathbb{C}}$, so let $A^{\mathbb{C}} v = \lambda v$, $v \neq 0$. Furthermore, $\overline{A^{\mathbb{C}}} = A^{\mathbb{C}}$, so
$$A^{\mathbb{C}} \overline{v} = \overline{A^{\mathbb{C}} v} = \overline{\lambda v} = \overline{\lambda}\, \overline{v};$$
hence, as $\overline{\lambda} \neq \lambda$,
$\overline{v}$ is an eigenvector of $A^{\mathbb{C}}$ with eigenvalue $\overline{\lambda}$, so $v$ and $\overline{v}$ are linearly independent over $\mathbb{C}$. If $v = x + iy$ with $x, y \in \mathbb{R}^2$, then
$$x = \frac{v + \overline{v}}{2}, \qquad y = \frac{v - \overline{v}}{2i},$$
from which it follows that if $x$ and $y$ were linearly dependent over $\mathbb{R}$ then $v$ and $\overline{v}$ would be linearly dependent over $\mathbb{C}$. Therefore $x, y$ are linearly independent over $\mathbb{R}$.
We have
$$A^{\mathbb{C}} v = \lambda v = (a + ib)(x + iy) = (ax - by) + i(bx + ay)$$
and
$$A^{\mathbb{C}} v = A^{\mathbb{C}}(x + iy) = Ax + iAy,$$
so
$$Ax = ax - by, \qquad Ay = bx + ay,$$
and hence, with $P$ the matrix whose columns are $x$ and $y$,
$$AP = P \begin{pmatrix} a & b \\ -b & a \end{pmatrix}.$$
Therefore
$$A = P \begin{pmatrix} a & b \\ -b & a \end{pmatrix} P^{-1}.$$
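The factorization $A = P \begin{pmatrix} a & b \\ -b & a \end{pmatrix} P^{-1}$ and the complex structure $J = \frac{1}{b}(A - aI)$ can be verified numerically. A sketch (NumPy; the example matrix is mine): take a real $2 \times 2$ matrix with characteristic roots $a \pm ib$, let $v = x + iy$ be an eigenvector for $a + ib$, and set $P = [x \ y]$.

```python
import numpy as np

A = np.array([[1.0, -5.0], [2.0, 3.0]])   # characteristic roots 2 +/- 3i
lam, V = np.linalg.eig(A)
k = int(np.argmax(lam.imag))       # pick lambda = a + ib with b > 0
a, b = lam[k].real, lam[k].imag
v = V[:, k]
x, y = v.real, v.imag              # v = x + iy with x, y in R^2

P = np.column_stack([x, y])
B = np.array([[a, b], [-b, a]])
assert np.allclose(A, P @ B @ np.linalg.inv(P))

J = (A - a * np.eye(2)) / b        # J = (A - aI)/b is a complex structure
assert np.allclose(J @ J, -np.eye(2))
```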