
Worksheet 5.3: Bilinear forms

Let \(V\) be a finite-dimensional vector space over \(\R\text{.}\) (We will use a real vector space for simplicity, but most of this works just as well for complex vector spaces.) Recall from Worksheet 3.4 that a linear functional on \(V\) is a linear map \(\phi:V\to \R\text{.}\)
We can extend the idea of a linear functional to functions with more than one argument.

Definition 5.3.1.

A bilinear form on a vector space \(V\) is a function
\begin{equation*} \phi:V\times V\to \R \end{equation*}
that is linear in each variable: for all scalars \(a,b\in\R\) and vectors \(\uu,\vv,\ww\in V\text{,}\)
\begin{align*} \phi(a\uu+b\vv,\ww) \amp = a\phi(\uu,\ww)+b\phi(\vv,\ww)\\ \phi(\uu,a\vv+b\ww) \amp = a\phi(\uu,\vv)+b\phi(\uu,\ww)\text{.} \end{align*}
Another way to put this is as follows: \(\phi\) is bilinear if, for each \(\vv\in V\text{,}\) the functions
\begin{align*} \eta(\ww) \amp =\phi(\vv,\ww)\\ \psi(\ww) \amp = \phi(\ww,\vv) \end{align*}
are both linear functionals on \(V\text{.}\)
Caution: the bilinear form \(\phi\) is not assumed to be symmetric, so the linear functionals \(\eta\) and \(\psi\) defined above are (in general) different functions.
A multilinear form is defined similarly, as a function of two, three, or more vector variables, that is linear in each variable.
An example of a bilinear form is the dot product on \(\R^n\text{.}\) Given \(\vv,\ww\in\R^n\text{,}\) the function \(\phi(\vv,\ww) = \vv\dotp\ww\) defines a bilinear form.
More generally, given any \(n\times n\) matrix \(A\text{,}\) the function
\begin{align*} \phi_A: \R^n\times\R^n \amp \to \R\\ \phi_A(\vv,\ww) \amp = \vv^T A\ww \end{align*}
is a bilinear form.
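If you would like to see the bilinearity of \(\phi_A\) in action before proving anything, here is a quick numerical sanity check. (This is a NumPy sketch, not part of the formal development; the matrix and vectors are arbitrary.)

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))      # an arbitrary 3x3 matrix
phi = lambda v, w: v @ A @ w         # phi_A(v, w) = v^T A w

u, v, w = rng.standard_normal((3, 3))  # three arbitrary vectors in R^3
a, b = 2.0, -3.0

# linearity in the first argument
assert np.isclose(phi(a*u + b*v, w), a*phi(u, w) + b*phi(v, w))
# linearity in the second argument
assert np.isclose(phi(u, a*v + b*w), a*phi(u, v) + b*phi(u, w))
```

Note that taking \(A=I\) recovers the dot product, so the dot product is the special case \(\phi_I\).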
An important example of a multilinear form is the determinant. Given \(\vv_1,\ldots, \vv_n\in \R^n\text{,}\) the function
\begin{align*} \Omega: \R^n\times\cdots \times\R^n \amp \to \R\\ \Omega(\vv_1,\ldots, \vv_n) \amp = \det\bbm \vv_1\amp\cdots\amp \vv_n\ebm \end{align*}
is an \(n\)-linear form.
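Multilinearity of the determinant can also be checked numerically: holding all but one column fixed, the determinant is linear in the remaining column. (Again a NumPy sketch with arbitrary vectors, shown here for \(n=3\) and the second column.)

```python
import numpy as np

rng = np.random.default_rng(1)
v1, v2, v3, u = rng.standard_normal((4, 3))  # arbitrary vectors in R^3
a, b = 1.5, -2.0

# Omega(v1, v2, v3) = det of the matrix with these columns
det = lambda *cols: np.linalg.det(np.column_stack(cols))

# linearity in the second column, with the first and third held fixed
lhs = det(v1, a*v2 + b*u, v3)
rhs = a*det(v1, v2, v3) + b*det(v1, u, v3)
assert np.isclose(lhs, rhs)
```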
For bilinear forms, the example \(\phi_A\) given above can be viewed as prototypical.

1.

Let \(V\) be a finite-dimensional vector space, and let \(B=\basis{e}{n}\) be an ordered basis for \(V\text{.}\) Let \(\phi:V\times V\to \R\) be a bilinear form on \(V\text{.}\)
Show that there exists an \(n\times n\) matrix \(A_\phi\) such that
\begin{equation*} \phi(\vv,\ww) = (C_B(\vv))^TA_\phi C_B(\ww), \end{equation*}
where \(C_B\) denotes the coordinate isomorphism from \(V\) to \(\R^n\text{.}\)
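The key computation behind this exercise is that the \((i,j)\) entry of \(A_\phi\) can be taken to be \(\phi(\mathbf{e}_i,\mathbf{e}_j)\). As a numerical illustration (a sketch, with \(V=\R^3\) and the standard basis, so that \(C_B\) is the identity), evaluating a form \(\phi_A\) on all pairs of basis vectors recovers the matrix we started with:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
phi = lambda v, w: v @ A @ w         # a bilinear form on R^3

# candidate matrix: (A_phi)_{ij} = phi(e_i, e_j)
e = np.eye(3)                        # rows are the standard basis vectors
A_phi = np.array([[phi(e[i], e[j]) for j in range(3)] for i in range(3)])

assert np.allclose(A_phi, A)         # recovers the matrix we started with
```

The exercise asks you to justify why this recipe works for any finite-dimensional \(V\) and any ordered basis.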
The above exercise tells us that we can study bilinear forms on a vector space by studying their matrix representations. This depends on a choice of basis, but, as one might expect, matrix representations with respect to different bases are similar.

2.

Let \(B_1,B_2\) be two ordered bases for a finite-dimensional vector space \(V\text{,}\) and let \(P=P_{B_1\leftarrow B_2}\) be the change of basis matrix for these bases. Let \(\phi:V\times V\to \R\) be a bilinear form on \(V\text{.}\)
If \(A_\phi\) is the matrix of \(\phi\) with respect to the basis \(B_1\text{,}\) show that the matrix of \(\phi\) with respect to \(B_2\) is equal to \(P^TA_\phi P\text{.}\)
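The identity at the heart of this exercise is that \(B_2\)-coordinates \(\vv_2\) correspond to \(B_1\)-coordinates \(P\vv_2\text{.}\) A numerical sketch (with arbitrary matrices standing in for \(A_\phi\) and \(P\); a random \(P\) is invertible with probability 1, though the identity below holds for any \(P\)):

```python
import numpy as np

rng = np.random.default_rng(3)
A1 = rng.standard_normal((3, 3))     # matrix of phi with respect to B1
P = rng.standard_normal((3, 3))      # change of basis matrix, B1 <- B2
phi = lambda v, w: v @ A1 @ w        # phi evaluated in B1-coordinates

# B2-coordinates v2, w2 correspond to B1-coordinates P v2, P w2,
# so the matrix of phi with respect to B2 should be P^T A1 P
A2 = P.T @ A1 @ P
v2, w2 = rng.standard_normal((2, 3))
assert np.isclose(phi(P @ v2, P @ w2), v2 @ A2 @ w2)
```

Note the contrast with linear operators, whose matrices transform as \(P^{-1}AP\) (similarity); bilinear forms transform as \(P^TA P\) (congruence).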
Bilinear forms can also be pulled back along linear transformations, just as linear functionals can.

3.

Let \(V\) and \(W\) be finite-dimensional vector spaces, and let \(T:V\to W\) be a linear transformation.
If \(\phi:W\times W\to\R\) is a bilinear form on \(W\text{,}\) show that the map \(T^*\phi:V\times V\to \R\) defined by
\begin{equation*} T^*\phi(\vv,\ww) = \phi(T(\vv),T(\ww)) \end{equation*}
is a bilinear form on \(V\text{,}\) called the pullback of \(\phi\) by \(T\text{.}\)
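In coordinates, if \(B\) is the matrix of \(T\) and \(A\) is the matrix of \(\phi\text{,}\) then the pullback \(T^*\phi\) is represented by \(B^TAB\text{.}\) (This claim is not stated in the exercise, but it follows the same pattern as the change-of-basis formula above; here is a numerical sketch with \(T:\R^3\to\R^4\).)

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))      # matrix of phi on W = R^4
B = rng.standard_normal((4, 3))      # matrix of T : R^3 -> R^4
phi = lambda v, w: v @ A @ w

# (T* phi)(v, w) = phi(T v, T w)
pullback = lambda v, w: phi(B @ v, B @ w)

# the pullback is represented by B^T A B
v, w = rng.standard_normal((2, 3))
assert np.isclose(pullback(v, w), v @ (B.T @ A @ B) @ w)
```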
A bilinear form \(\phi\) on \(V\) is symmetric if \(\phi(\vv,\ww)=\phi(\ww,\vv)\) for all \(\vv,\ww\in V\text{,}\) and antisymmetric (or alternating) if \(\phi(\vv,\ww)=-\phi(\ww,\vv)\) for all \(\vv,\ww\) in \(V\text{.}\)
A bilinear form is nondegenerate if, for each nonzero vector \(\vv\in V\text{,}\) there exists a vector \(\ww\in V\) such that \(\phi(\vv,\ww)\neq 0\text{.}\) (Alternatively, for each nonzero \(\vv\in V\text{,}\) the linear functional \(\alpha(\ww)=\phi(\vv,\ww)\) is nonzero.)
Two types of bilinear forms are of particular importance. A symmetric, nondegenerate bilinear form on \(V\) is called an inner product on \(V\) if it is also positive-definite: for each \(\vv\in V\text{,}\) \(\phi(\vv,\vv)\geq 0\text{,}\) with equality only if \(\vv=\mathbf{0}\text{.}\) Inner products are a generalization of the dot product from Chapter 3. A future version of this book may take the time to study inner products in more detail, but for now we will look at another type of bilinear form.
A nondegenerate, antisymmetric bilinear form \(\omega\) on \(V\) is called a linear symplectic structure on \(V\text{,}\) and we call the pair \((V,\omega)\) a symplectic vector space. Symplectic structures are important in differential geometry and mathematical physics. (They can be used to encode Hamilton’s equations in classical mechanics.)
An example of a symplectic form on \(\R^2\) is given as follows: for vectors \(\vv = (v_1,v_2)\) and \(\ww = (w_1,w_2)\text{,}\)
\begin{equation*} \omega(\vv,\ww) = v_1w_2-v_2w_1\text{.} \end{equation*}
If this looks familiar, it’s because this is precisely the determinant of the \(2\times 2\) matrix \(\bbm \vv\amp \ww\ebm\text{.}\)
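A quick numerical confirmation of both the determinant identity and the antisymmetry (a sketch with arbitrary vectors):

```python
import numpy as np

# the symplectic form on R^2: omega(v, w) = v1 w2 - v2 w1
omega = lambda v, w: v[0]*w[1] - v[1]*w[0]

v, w = np.array([2.0, 5.0]), np.array([-1.0, 3.0])
# omega(v, w) is the determinant of the matrix with columns v, w
assert np.isclose(omega(v, w), np.linalg.det(np.column_stack([v, w])))
# antisymmetry: omega(v, w) = -omega(w, v)
assert np.isclose(omega(v, w), -omega(w, v))
```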
A more general example is given by \(V=\R^{2n}\text{.}\) If we write \(\vv = \langle a_1,b_1,\ldots, a_n,b_n\rangle\text{,}\) \(\ww = \langle c_1,d_1,\ldots, c_n,d_n\rangle\text{,}\) then the standard symplectic structure on \(\R^{2n}\) is given by
\begin{equation*} \omega(\vv,\ww) = a_1d_1-b_1c_1 + \cdots + a_nd_n-b_nc_n\text{.} \end{equation*}
Note how this resembles a sum of \(2\times 2\) determinants.

Remark 5.3.2.

A theorem that you will not be asked to prove (it’s a long proof...) is that if a vector space \(V\) has a linear symplectic structure \(\omega\text{,}\) then the dimension of \(V\) is even, and \(V\) has a basis \(\{\mathbf{e}_1,\mathbf{f}_1,\ldots, \mathbf{e}_n,\mathbf{f}_n\}\) with respect to which \(\omega\) agrees with the standard symplectic structure on \(\R^{2n}\text{.}\)
We conclude with some interesting connections between complex vector spaces and symplectic and inner product structures.
Here is an observation you may have made before: to any complex number \(a+ib\) we can associate the matrix
\begin{equation*} \bbm a\amp -b\\b\amp a\ebm = a\bbm 1\amp 0\\0\amp 1\ebm + b\bbm 0\amp -1\\1\amp 0\ebm = aI+bJ\text{,} \end{equation*}
where \(I\) is the \(2\times 2\) identity matrix, and \(J^2=-I\text{.}\)
You can even check that multiplying two complex numbers is the same as multiplying the corresponding matrices, as given above!
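Checking this claim amounts to expanding \((aI+bJ)(cI+dJ) = (ac-bd)I + (ad+bc)J\) using \(J^2=-I\text{,}\) and comparing with \((a+ib)(c+id)\text{.}\) A numerical sketch:

```python
import numpy as np

I = np.eye(2)
J = np.array([[0.0, -1.0], [1.0, 0.0]])
mat = lambda z: z.real * I + z.imag * J   # a + ib  ->  aI + bJ

assert np.allclose(J @ J, -I)             # J^2 = -I

# multiplying complex numbers matches multiplying their matrices
z, w = 2 + 3j, -1 + 4j
assert np.allclose(mat(z) @ mat(w), mat(z * w))
```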

4.

For the symplectic structure \(\omega(\vv,\ww) = v_1w_2-v_2w_1\) on \(\R^2\text{,}\) as given above, show that the matrix of \(\omega\) with respect to the standard basis is the matrix \(J_1 = \bbm 0\amp 1\\ -1\amp 0\ebm\text{.}\)
Then, for any symplectic vector space \((V,\omega)\text{,}\) show that, with respect to the basis \(\{\mathbf{e}_1,\mathbf{f}_1,\ldots, \mathbf{e}_n,\mathbf{f}_n\}\) described in Remark 5.3.2 above, the matrix of \(\omega\) has the block form
\begin{equation*} J = \bbm J_1 \amp 0 \amp \cdots \amp 0\\ 0\amp J_1\amp \cdots \amp 0\\ \vdots\amp \vdots \amp \ddots\amp \vdots\\0\amp 0\amp \cdots \amp J_1\ebm\text{.} \end{equation*}
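If you want to check this numerically before proving it, the matrix of the standard symplectic structure can be computed entrywise, since the \((i,j)\) entry is \(\omega(\mathbf{e}_i,\mathbf{e}_j)\text{.}\) A sketch on \(\R^4\) (that is, \(n=2\)):

```python
import numpy as np

n = 2  # working in R^{2n} with n = 2

def omega(x, y):
    # standard symplectic structure, with x = (a1, b1, ..., an, bn)
    a, b = x[0::2], x[1::2]
    c, d = y[0::2], y[1::2]
    return np.sum(a*d - b*c)

# matrix of omega: entries omega(e_i, e_j)
E = np.eye(2*n)  # rows are the standard basis vectors
Jmat = np.array([[omega(E[i], E[j]) for j in range(2*n)]
                 for i in range(2*n)])

# Jmat represents omega: omega(x, y) = x^T Jmat y, and it is
# block diagonal with 2x2 blocks
rng = np.random.default_rng(5)
x, y = rng.standard_normal((2, 2*n))
assert np.isclose(omega(x, y), x @ Jmat @ y)
assert np.allclose(Jmat[:2, 2:], 0) and np.allclose(Jmat[2:, :2], 0)
```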
There are also interesting relationships between complex inner products, real inner products, and symplectic structures.

5.

Let \(\langle \vv,\ww\rangle\) denote the standard complex inner product on \(\C^n\text{.}\) (Recall that such an inner product is complex linear in the second argument, but for any complex scalar \(c\text{,}\) \(\langle c\vv,\ww\rangle = \overline{c}\langle \vv,\ww\rangle\text{.}\))
Write \(\vv = \mathbf{a}+i\mathbf{b}\) and \(\ww = \mathbf{c}+i\mathbf{d}\text{,}\) where \(\mathbf{a} = \langle a_1, a_2,\ldots, a_n\rangle\in\R^n\) (with similar statements for \(\mathbf{b},\mathbf{c},\mathbf{d}\)). Let \(\xx = \langle a_1,b_1,\ldots, a_n,b_n\rangle, \yy = \langle c_1,d_1,\ldots, c_n,d_n\rangle\in \R^{2n}\text{.}\)
Show that
\begin{equation*} \langle \vv,\ww\rangle = \xx\cdot \yy + i\phi(\xx,\yy)\text{,} \end{equation*}
where \(\phi\) denotes the standard symplectic structure on \(\R^{2n}\text{.}\)
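Expanding \(\overline{v}_kw_k = (a_k-ib_k)(c_k+id_k)\) termwise is the heart of this exercise; the real parts assemble into \(\xx\cdot\yy\) and the imaginary parts into \(\phi(\xx,\yy)\text{.}\) A numerical sketch with arbitrary vectors (using the convention from the exercise, conjugate-linear in the first argument):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 3
a, b, c, d = rng.standard_normal((4, n))
v, w = a + 1j*b, c + 1j*d                 # vectors in C^n

# standard complex inner product, conjugate-linear in the first argument
herm = np.sum(np.conj(v) * w)

# interleave real and imaginary parts: x = (a1, b1, ..., an, bn)
x = np.ravel(np.column_stack([a, b]))
y = np.ravel(np.column_stack([c, d]))
phi = np.sum(a*d - b*c)                   # standard symplectic structure

assert np.isclose(herm, x @ y + 1j*phi)
```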
For more reading on multilinear forms and determinants, see the 4th edition of Linear Algebra Done Right, by Sheldon Axler. For more reading on linear symplectic structures, see First Steps in Differential Geometry, by Andrew McInerney.