We conclude our discussion of rank by describing the matrices of linear transformations of rank \(\leq 1\) .

Theorem 1. If a linear transformation \(A\) on a finite-dimensional vector space \(\mathcal{V}\) is such that \(\rho(A) \leq 1\) (that is, \(\rho(A)=0\) or \(\rho(A)=1\) ), then the elements of the matrix \([A]=(\alpha_{i j})\) of \(A\) have the form \(\alpha_{i j}=\beta_{i} \gamma_{j}\) in every coordinate system; conversely, if the matrix of \(A\) has this form in some one coordinate system, then \(\rho(A) \leq 1\) .

Proof. If \(\rho(A)=0\) , then \(A=0\) , and the statement is trivial. If \(\rho(A)=1\) , that is, \(\mathcal{R}(A)\) is one-dimensional, then there exists in \(\mathcal{R}(A)\) a non-zero vector \(x_{0}\) (a basis in \(\mathcal{R}(A)\)) such that every vector in \(\mathcal{R}(A)\) is a multiple of \(x_{0}\) . Hence, for every \(x\) , \[A x=y_{0}(x)\, x_{0},\] where the scalar coefficient \(y_{0}(x)\) depends, of course, on \(x\) . The linearity of \(A\) implies that \(y_{0}\) is a linear functional on \(\mathcal{V}\) . Let \(\mathcal{X}=\{x_{1}, \ldots, x_{n}\}\) be a basis in \(\mathcal{V}\) , and let \((\alpha_{i j})\) be the corresponding matrix of \(A\) , so that \[A x_{j}=\sum_{i} \alpha_{i j} x_{i}.\] If \(\mathcal{X}^{\prime}=\{y_{1}, \ldots, y_{n}\}\) is the dual basis in \(\mathcal{V}^{\prime}\) , then (cf. Section: Adjoints of projections, (2)) \[\alpha_{i j}=[A x_{j}, y_{i}].\] In the present case \[\alpha_{i j}=[y_{0}(x_{j}) x_{0}, y_{i}]=y_{0}(x_{j})[x_{0}, y_{i}]=[x_{0}, y_{i}][x_{j}, y_{0}];\] in other words, we may take \(\beta_{i}=[x_{0}, y_{i}]\) and \(\gamma_{j}=[x_{j}, y_{0}]\) .

Conversely, suppose that in a fixed coordinate system \(\mathcal{X}=\{x_{1}, \ldots, x_{n}\}\) the matrix \((\alpha_{i j})\) of \(A\) is such that \(\alpha_{i j}=\beta_{i} \gamma_{j}\) . We may find a linear functional \(y_{0}\) such that \(\gamma_{j}=[x_{j}, y_{0}]\) , and we may define a vector \(x_{0}\) by \(x_{0}=\sum_{k} \beta_{k} x_{k}\) . The linear transformation \(\tilde{A}\) defined by \(\tilde{A} x=y_{0}(x) x_{0}\) is clearly of rank one (unless, of course, \(\alpha_{i j}=0\) for all \(i\) and \(j\) ), and its matrix \((\tilde{\alpha}_{i j})\) in the coordinate system \(\mathcal{X}\) is given by \[\tilde{\alpha}_{i j}=[\tilde{A} x_{j}, y_{i}]\] (where \(\mathcal{X}^{\prime}=\{y_{1}, \ldots, y_{n}\}\) is the dual basis of \(\mathcal{X}\) ). Hence \[\tilde{\alpha}_{i j}=[y_{0}(x_{j}) x_{0}, y_{i}]=[x_{0}, y_{i}][x_{j}, y_{0}]=\beta_{i} \gamma_{j},\] and, since \(A\) and \(\tilde{A}\) have the same matrix in one coordinate system, it follows that \(\tilde{A}=A\) . This concludes the proof of the theorem. ◻
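
In matrix terms, Theorem 1 says that the matrices of rank at most one are exactly the outer products of a column by a row. Here is a minimal numerical sketch of both directions, assuming NumPy; the particular vectors (and the use of the first column in the converse) are illustrative choices, not part of the theorem.

```python
import numpy as np

beta = np.array([1.0, -2.0, 3.0])    # plays the role of beta_i = [x_0, y_i]
gamma = np.array([4.0, 0.0, -1.0])   # plays the role of gamma_j = [x_j, y_0]

A = np.outer(beta, gamma)            # alpha[i, j] = beta[i] * gamma[j]
print(np.linalg.matrix_rank(A))      # 1: an outer product has rank <= 1

# Conversely, a rank-one matrix factors as an outer product: a nonzero
# column serves as the vector x_0 (a basis for the range), and dividing
# a suitable row by its entry in that column recovers the coefficients
# gamma up to a scalar factor.
col = A[:, 0]                        # nonzero here: the vector x_0
row = A[col != 0][0]                 # a row whose entry in column 0 is nonzero
coeffs = row / col[col != 0][0]      # gamma, rescaled
print(np.allclose(A, np.outer(col, coeffs)))  # True
```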

The following theorem sometimes makes it possible to apply Theorem 1 to obtain results about an arbitrary linear transformation.

Theorem 2. If \(A\) is a linear transformation of rank \(\rho\) on a finite-dimensional vector space \(\mathcal{V}\) , then \(A\) may be written as the sum of \(\rho\) transformations of rank one.

Proof. Since \(A\mathcal{V} = \mathcal{R}(A)\) has dimension \(\rho\) , we may find \(\rho\) vectors \(x_{1}, \ldots, x_{\rho}\) that form a basis for \(\mathcal{R}(A)\) . It follows that, for every vector \(x\) in \(\mathcal{V}\) , we have \[Ax = \sum_{i = 1}^\rho \xi_i x_i,\] where each \(\xi_{i}\) depends, of course, on \(x\) ; we write \(\xi_{i}=y_{i}(x)\) . It is easy to see that \(y_{i}\) is a linear functional. In terms of these \(y_{i}\) we define, for each \(i=1, \ldots, \rho\) , a linear transformation \(A_{i}\) by \(A_{i} x=y_{i}(x) x_{i}\) . It follows that each \(A_{i}\) has rank one and \(A=\sum_{i=1}^\rho A_{i}\) . (Compare this result with Section: Linear transformations, example (2).) ◻
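
The proof is constructive, and the following sketch carries it out numerically, assuming NumPy; the matrix \(A\), and the use of a QR factorization to produce a basis for the range, are illustrative choices.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 1.0, 1.0]])
rho = np.linalg.matrix_rank(A)            # here rho == 2

# For this A the first rho columns are independent, so the leading rho
# columns of Q span the range; in general a rank-revealing factorization
# (the SVD, say) would be used instead of plain QR.
Q, _ = np.linalg.qr(A)
X = Q[:, :rho]                            # columns: a basis x_1, ..., x_rho for R(A)
Y = X.T @ A                               # row i plays the role of the functional y_i

# A splits as the sum of rho rank-one transformations A_i x = y_i(x) x_i.
terms = [np.outer(X[:, i], Y[i]) for i in range(rho)]
print(all(np.linalg.matrix_rank(T) == 1 for T in terms))   # True
print(np.allclose(A, sum(terms)))                           # True
```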

A slight refinement of the proof just given yields the following result.

Theorem 3. Corresponding to any linear transformation \(A\) on a finite-dimensional vector space \(\mathcal{V}\) there is an invertible linear transformation \(P\) for which \(P A\) is a projection.

Proof. Let \(\mathcal{R}\) and \(\mathcal{N}\) , respectively, be the range and the null-space of \(A\) , and let \(\{x_{1}, \ldots, x_{\rho}\}\) be a basis for \(\mathcal{R}\) . Let \(x_{\rho+1}, \ldots, x_{n}\) be vectors such that \(\{x_{1}, \ldots, x_{n}\}\) is a basis for \(\mathcal{V}\) . Since \(x_{i}\) is in \(\mathcal{R}\) for \(i=1, \ldots, \rho\) , we may find vectors \(y_i\) such that \(Ay_i = x_i\) ; finally, we choose a basis for \(\mathcal{N}\) , which (since the rank and the nullity of \(A\) add up to \(n\) ) we may denote by \(\{y_{\rho + 1}, \ldots, y_n\}\) . We assert that \(\{y_{1}, \ldots, y_{n}\}\) is a basis for \(\mathcal{V}\) . We need, of course, to prove only that the \(y\) ’s are linearly independent. For this purpose we suppose that \(\sum_{i=1}^{n} \alpha_{i} y_{i}=0\) ; then we have (remembering that for \(i=\rho+1, \ldots, n\) the vector \(y_{i}\) belongs to \(\mathcal{N}\) ) \[A\Big(\sum_{i=1}^{n} \alpha_{i} y_{i}\Big)=\sum_{i=1}^{\rho} \alpha_{i} x_{i}=0,\] whence \(\alpha_{1}=\cdots=\alpha_{\rho}=0\) . Consequently \(\sum_{i=\rho+1}^{n} \alpha_{i} y_{i}=0\) ; the linear independence of \(y_{\rho+1}, \ldots, y_{n}\) shows that the remaining \(\alpha\) ’s must also vanish.

A linear transformation \(P\) of the kind whose existence we asserted is now determined by the conditions \(P x_{i}=y_{i}\) , \(i=1, \ldots, n\) ; since \(P\) maps the basis \(\{x_{1}, \ldots, x_{n}\}\) onto the basis \(\{y_{1}, \ldots, y_{n}\}\) , it is invertible. Indeed, if \(i=1, \ldots, \rho\) , then \(P A y_{i}=P x_{i}=y_{i}\) , and if \(i=\rho+1, \ldots, n\) , then \(P A y_{i}=P 0=0\) ; hence \(P A\) is the projection on the subspace spanned by \(y_{1}, \ldots, y_{\rho}\) along the subspace spanned by \(y_{\rho+1}, \ldots, y_{n}\) . ◻
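
The construction in the proof is concrete enough to carry out numerically. The following sketch does so for one rank-one transformation \(A\), assuming NumPy; the particular vectors \(x_{i}\), \(y_{i}\) are illustrative choices satisfying the conditions of the proof.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])           # rank one

x1 = np.array([1.0, 2.0])            # basis for the range of A
x2 = np.array([0.0, 1.0])            # extends {x1} to a basis of V
y1 = np.array([1.0, 0.0])            # a preimage: A @ y1 == x1
y2 = np.array([-2.0, 1.0])           # basis for the null space of A

X = np.column_stack([x1, x2])        # P is determined by P x_i = y_i,
Y = np.column_stack([y1, y2])        # i.e. P @ X == Y
P = Y @ np.linalg.inv(X)             # invertible: it maps a basis onto a basis

E = P @ A
print(np.allclose(E @ E, E))         # True: P A is a projection
print(np.linalg.det(P) != 0)         # True: P is invertible
```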

Consideration of the adjoint of \(A\) , together with the reflexivity of \(\mathcal{V}\) , shows that we may also find an invertible \(Q\) for which \(A Q\) is a projection. In case \(A\) itself is invertible, we must have \(P=Q=A^{-1}\) .
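
Continuing the sketch above (and reusing its \(A\) and \(P\)), the remark can be illustrated numerically; matrix transposition plays the role of the adjoint here, which is legitimate once a basis has been fixed.

```python
# Applying the construction of Theorem 3 to the transpose of A yields a
# P' with P' A^T a projection E; transposing gives A P'^T = E^T, again a
# projection, so Q = P'^T serves. Since this particular A is symmetric,
# the P already computed works as P'.
Q = P.T
F = A @ Q
print(np.allclose(F @ F, F))         # True: A Q is a projection
print(np.linalg.det(Q) != 0)         # True: Q is invertible
```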

EXERCISES

Exercise 1. What is the rank of the differentiation operator on \(\mathcal{P}_{n}\) ? What is its nullity?

Exercise 2. Find the ranks of the following matrices. (A numerical check is sketched after the list.)

  1. \(\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix}\) 
  2. \(\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix}\) 
  3. \(\begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix}\) 
  4. \(\begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix}\) 
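
A quick numerical check of the answers, assuming NumPy (a convenience only, not a substitute for the hand computation):

```python
import numpy as np

matrices = [
    np.ones((3, 3)),                                   # (1)
    np.array([[1, 1, 1], [1, 1, 0], [1, 0, 0]]),       # (2)
    np.array([[0, 0, 1], [0, 1, 0], [1, 0, 0]]),       # (3)
    np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]]),       # (4)
]
for M in matrices:
    print(np.linalg.matrix_rank(M))
```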

Exercise 3. If \(A\) is left multiplication by \(P\) on a space of linear transformations (cf. Section: Matrices of transformations, Ex. 5), and if \(P\) has rank \(m\) , what is the rank of \(A\) ?

Exercise 4. The rank of the direct sum of two linear transformations (on finite-dimensional vector spaces) is the sum of their ranks.

Exercise 5. 

  1. If \(A\) and \(B\) are linear transformations on an \(n\) -dimensional vector space, and if \(A B=0\) , then \(\rho(A)+\rho(B) \leq n\) .
  2. For each linear transformation \(A\) on an \(n\) -dimensional vector space there exists a linear transformation \(B\) such that \(A B=0\) and such that \(\rho(A)+\rho(B)=n\) .

Exercise 6. If \(A\) , \(B\) , and \(C\) are linear transformations on a finite-dimensional vector space, then \[\rho(A B)+\rho(B C) \leq \rho(B)+\rho(A B C).\] 

Exercise 7. Prove that two linear transformations (on the same finite-dimensional vector space) are equivalent if and only if they have the same rank.

Exercise 8. 

  1. Suppose that \(A\) and \(B\) are linear transformations (on the same finite-dimensional vector space) such that \(A^{2}=A\) and \(B^{2}=B\) . Is it true that \(A\) and \(B\) are similar if and only if \(\rho(A)=\rho(B)\) ?
  2. Suppose that \(A\) and \(B\) are linear transformations (on the same finite-dimensional vector space) such that \(A \neq 0\) , \(B \neq 0\) , and \(A^{2}=B^{2}=0\) . Is it true that \(A\) and \(B\) are similar if and only if \(\rho(A)=\rho(B)\) ?

Exercise 9. 

  1. If \(A\) is a linear transformation of rank one, then there exists a unique scalar \(\alpha\) such that \(A^{2}=\alpha A\) .
  2. If \(\alpha \neq 1\) , then \(1-A\) is invertible.