Definition 1. A (linear) basis (or a coordinate system ) in a vector space \(\mathcal{V}\) is a set \(\mathcal{X}\) of linearly independent vectors such that every vector in \(\mathcal{V}\) is a linear combination of elements of \(\mathcal{X}\) . A vector space \(\mathcal{V}\) is finite-dimensional if it has a finite basis.

Except for the occasional consideration of examples we shall restrict our attention, throughout this book, to finite-dimensional vector spaces.

For examples of bases we turn again to the spaces \(\mathcal{P}\) and \(\mathbb{C}^{n}\) . In \(\mathcal{P}\) , the set \(\{x_{n}\}\) , where \(x_{n}(t)=t^{n}\) , \(n=0,1,2, \ldots\) , is a basis; every polynomial is, by definition, a linear combination of a finite number of \(x_{n}\) . Moreover \(\mathcal{P}\) has no finite basis, for, given any finite set of polynomials, we can find a polynomial of higher degree than any of them; this latter polynomial is obviously not a linear combination of the former ones.
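The expansion of a polynomial in the basis \(\{x_{n}\}\) is nothing but its coefficient list. A minimal computational sketch (the helper name `eval_poly` is ours, not from the text):

```python
# Sketch: a polynomial given by its coefficient list IS its expansion
# in the monomial basis x_n(t) = t^n of the space P.

def eval_poly(coeffs, t):
    """Evaluate sum_n coeffs[n] * t**n, a finite linear combination
    of the basis polynomials x_n."""
    return sum(c * t**n for n, c in enumerate(coeffs))

p = [3, 0, -2, 5]          # the polynomial 3 - 2t^2 + 5t^3
print(eval_poly(p, 2))     # 3 - 2*4 + 5*8 = 35
```

The degree argument in the text is visible here too: a combination of finitely many coefficient lists can never produce a list longer than the longest one used.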

An example of a basis in \(\mathbb{C}^{n}\) is the set of vectors \(x_{i}\) , \(i=1, \ldots, n\) , defined by the condition that the \(j\) -th coordinate of \(x_{i}\) is \(\delta_{i j}\) . (Here we use for the first time the popular Kronecker \(\delta\) ; it is defined by \(\delta_{i j}=1\) if \(i=j\) and \(\delta_{i j}=0\) if \(i \neq j\) .) Thus we assert that in \(\mathbb{C}^{3}\) the vectors \(x_{1}=(1,0,0)\) , \(x_{2}=(0,1,0)\) , and \(x_{3}=(0,0,1)\) form a basis. It is easy to see that they are linearly independent; the formula \[x=(\xi_{1}, \xi_{2}, \xi_{3})=\xi_{1} x_{1}+\xi_{2} x_{2}+\xi_{3} x_{3}\] proves that every \(x\) in \(\mathbb{C}^{3}\) is a linear combination of them.
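The construction from the Kronecker \(\delta\) and the displayed decomposition can be checked directly; the following sketch (the names `kronecker` and `standard_basis` are ours) does so for \(\mathbb{C}^{3}\):

```python
# Sketch: build the standard basis of C^3 from the Kronecker delta and
# verify that x = xi_1*x_1 + xi_2*x_2 + xi_3*x_3 recovers x.

def kronecker(i, j):
    """Kronecker delta: 1 if i == j, else 0."""
    return 1 if i == j else 0

def standard_basis(n):
    """The vectors x_i whose j-th coordinate is delta_{ij}."""
    return [tuple(kronecker(i, j) for j in range(n)) for i in range(n)]

x1, x2, x3 = standard_basis(3)
xi = (2 + 1j, -3, 5j)          # an arbitrary vector of C^3

combo = tuple(xi[0]*a + xi[1]*b + xi[2]*c for a, b, c in zip(x1, x2, x3))
print(combo == xi)             # the decomposition recovers x exactly
```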

In a general finite-dimensional vector space \(\mathcal{V}\) , with basis \(\{x_{1}, \ldots, x_{n}\}\) , we know that every \(x\) can be written in the form \[x=\sum_{i} \xi_{i} x_{i};\] we assert that the \(\xi\) ’s are uniquely determined by \(x\) . The proof of this assertion is an argument often used in the theory of linear dependence. If we had \(x=\sum_{i} \eta_{i} x_{i}\) , then we should have, by subtraction, \[\sum_{i}(\xi_{i}-\eta_{i}) x_{i}=0.\] Since the \(x_{i}\) are linearly independent, this implies that \(\xi_{i}-\eta_{i}=0\) for \(i=1, \ldots, n\) ; in other words, the \(\xi\) ’s are the same as the \(\eta\) ’s. (Observe that writing \(\{x_{1}, \ldots, x_{n}\}\) for a basis with \(n\) elements is not the proper thing to do in case \(n=0\) . We shall, nevertheless, frequently use this notation. Whenever that is done, it is, in principle, necessary to adjoin a separate discussion designed to cover the vector space \(\mathcal{O}\) . In fact, however, everything about that space is so trivial that the details are not worth writing down, and we shall omit them.)
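Finding the coordinates \(\xi_{i}\) of a vector relative to a basis amounts to solving a linear system, and the uniqueness just proved is the statement that this system has exactly one solution. A sketch over the rationals (the helper `coordinates` is ours; it assumes the given vectors really are independent):

```python
# Sketch: compute the unique coordinates of x relative to a basis of Q^3
# by Gauss-Jordan elimination with exact Fraction arithmetic.
from fractions import Fraction

def coordinates(basis, x):
    """Solve sum_i xi_i * basis[i] = x.  The solution is unique because
    the basis vectors are assumed linearly independent."""
    n = len(basis)
    # augmented matrix whose i-th column is the i-th basis vector
    A = [[Fraction(basis[i][j]) for i in range(n)] + [Fraction(x[j])]
         for j in range(n)]
    for col in range(n):
        piv = next(r for r in range(col, n) if A[r][col] != 0)
        A[col], A[piv] = A[piv], A[col]
        A[col] = [v / A[col][col] for v in A[col]]
        for r in range(n):
            if r != col and A[r][col] != 0:
                A[r] = [a - A[r][col] * b for a, b in zip(A[r], A[col])]
    return [A[r][n] for r in range(n)]

basis = [(1, 1, 0), (0, 1, 1), (1, 0, 1)]   # a basis of Q^3
x = (2, 3, 5)
print(coordinates(basis, x))                # the unique xi's
```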

Theorem 1. If \(\mathcal{V}\) is a finite-dimensional vector space and if \(\{y_{1}, \ldots, y_{m}\}\) is any set of linearly independent vectors in \(\mathcal{V}\) , then, unless the \(y\) ’s already form a basis, we can find vectors \(y_{m+1}, \ldots, y_{m+p}\) so that the totality of the \(y\) ’s, that is, \(\{y_{1}, \ldots, y_{m}, y_{m+1}, \ldots, y_{m+p}\}\) , is a basis. In other words, every linearly independent set can be extended to a basis.

Proof. Since \(\mathcal{V}\) is finite-dimensional, it has a finite basis, say \(\{x_{1}, \ldots, x_{n}\}\) . We consider the set \(\mathcal{S}\) of vectors \[y_{1}, \ldots, y_{m}, x_{1}, \ldots, x_{n}\] in this order, and we apply to this set the theorem of Section: Linear combinations several times in succession. In the first place, the set \(\mathcal{S}\) is linearly dependent, since the \(y\) ’s are (as are all vectors) linear combinations of the \(x\) ’s. Hence some vector of \(\mathcal{S}\) is a linear combination of the preceding ones; let \(z\) be the first such vector. Then \(z\) is different from each \(y_{i}\) , \(i=1, \ldots, m\) (since the \(y\) ’s are linearly independent), so that \(z\) is equal to some \(x\) , say \(z=x_{i}\) . We consider the new set \(\mathcal{S}^{\prime}\) of vectors \[y_{1}, \ldots, y_{m}, x_{1}, \ldots, x_{i-1}, x_{i+1}, \ldots, x_{n}.\] We observe that every vector in \(\mathcal{V}\) is a linear combination of vectors in \(\mathcal{S}^{\prime}\) , since by means of \(y_{1}, \ldots, y_{m}, x_{1}, \ldots, x_{i-1}\) we may express \(x_{i}\) , and then by means of \(x_{1}, \ldots, x_{i-1}, x_{i}, x_{i+1}, \ldots, x_{n}\) we may express any vector. (The \(x\) ’s form a basis.) If \(\mathcal{S}^{\prime}\) is linearly independent, we are done. If it is not, we apply the theorem of Section: Linear combinations again and again in the same way until we reach a linearly independent set containing \(y_{1}, \ldots, y_{m}\) , in terms of which we may express every vector in \(\mathcal{V}\) . This last set is a basis containing the \(y\) ’s. ◻
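The proof is constructive, and the procedure it describes can be stated forwards: scan \(y_{1}, \ldots, y_{m}, x_{1}, \ldots, x_{n}\) and keep each vector that is not a linear combination of the ones already kept. A minimal sketch over \(\mathbb{Q}^{n}\) (the helpers `rank` and `extend_to_basis` are ours):

```python
# Sketch of the basis-extension procedure of Theorem 1 over Q^n:
# keep each scanned vector that is NOT dependent on the vectors kept so far.
from fractions import Fraction

def rank(vectors):
    """Rank of a list of vectors, by Gaussian elimination over Q."""
    rows = [[Fraction(v) for v in vec] for vec in vectors]
    r = 0
    for col in range(len(rows[0]) if rows else 0):
        piv = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][col] != 0:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def extend_to_basis(ys, xs):
    """Extend the independent ys to a basis, drawing on the basis xs."""
    kept = list(ys)
    for x in xs:
        if rank(kept + [x]) > rank(kept):   # x is independent of what we kept
            kept.append(x)
    return kept

ys = [(1, 1, 0), (0, 1, 1)]                 # independent, not yet a basis
xs = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]      # the standard basis of Q^3
print(extend_to_basis(ys, xs))              # a basis containing the y's
```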

EXERCISES

Exercise 1. 

  1. Prove that the four vectors \begin{align} & x=(1,0,0) \\ & y=(0,1,0) \\ & z=(0,0,1) \\ & u=(1,1,1) \end{align} in \(\mathbb{C}^{3}\) form a linearly dependent set, but any three of them are linearly independent. (To test the linear dependence of vectors \(x=(\xi_{1}, \xi_{2}, \xi_{3})\) , \(y=(\eta_{1}, \eta_{2}, \eta_{3})\) , and \(z=(\zeta_{1}, \zeta_{2}, \zeta_{3})\) in \(\mathbb{C}^{3}\) , proceed as follows. Assume that \(\alpha\) , \(\beta\) , and \(\gamma\) can be found so that \(\alpha x+\beta y+\gamma z=0\) . This means that \begin{align} & \alpha \xi_{1}+\beta \eta_{1}+\gamma \zeta_{1}=0 \\ & \alpha \xi_{2}+\beta \eta_{2}+\gamma \zeta_{2}=0 \\ & \alpha \xi_{3}+\beta \eta_{3}+\gamma \zeta_{3}=0. \end{align} The vectors \(x\) , \(y\) , and \(z\) are linearly dependent if and only if these equations have a solution other than \(\alpha=\beta=\gamma=0\) .)
  2. If the vectors \(x\) , \(y\) , \(z\) , and \(u\) in \(\mathcal{P}\) are defined by \(x(t)=1\) , \(y(t)=t\) , \(z(t)=t^{2}\) , and \(u(t)=1+t+t^{2}\) , prove that \(x\) , \(y\) , \(z\) , and \(u\) are linearly dependent, but any three of them are linearly independent.
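For part 1, the homogeneous system has a nontrivial solution exactly when the determinant of the \(3 \times 3\) coefficient matrix vanishes; the following sketch (the helper `det3` is ours) checks both claims at once:

```python
# Sketch for part 1: three vectors in C^3 are linearly dependent exactly
# when the determinant of the 3x3 matrix of their coordinates vanishes.
from itertools import combinations

def det3(a, b, c):
    """Determinant of the 3x3 matrix with rows a, b, c."""
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

x, y, z, u = (1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)

# every triple is independent (nonzero determinant)...
print(all(det3(*triple) != 0 for triple in combinations((x, y, z, u), 3)))
# ...but the four together are dependent, since u = x + y + z
print(tuple(a + b + c for a, b, c in zip(x, y, z)) == u)
```

Both lines print `True`; the same determinant test settles part 2 once the polynomials are replaced by their coefficient triples.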

Exercise 2. Prove that if \(\mathbb{R}\) is considered as a rational vector space (see Section: Examples, (8)), then a necessary and sufficient condition that the vectors \(1\) and \(\xi\) in \(\mathbb{R}\) be linearly independent is that the real number \(\xi\) be irrational.

Exercise 3. Is it true that if \(x\) , \(y\) , and \(z\) are linearly independent vectors, then so also are \(x+y\) , \(y+z\) , and \(z+x\) ?

Exercise 4. 

  1. Under what conditions on the scalar \(\xi\) are the vectors \((1+\xi, 1-\xi)\) and \((1-\xi, 1+\xi)\) in \(\mathbb{C}^{2}\) linearly dependent?
  2. Under what conditions on the scalar \(\xi\) are the vectors \((\xi, 1,0)\) , \((1, \xi, 1)\) , and \((0,1, \xi)\) in \(\mathbb{R}^{3}\) linearly dependent?
  3. What is the answer to (2) for \(\mathbb{Q}^{3}\) (in place of \(\mathbb{R}^{3}\) )?

Exercise 5. 

  1. The vectors \((\xi_{1}, \xi_{2})\) and \((\eta_{1}, \eta_{2})\) in \(\mathbb{C}^{2}\) are linearly dependent if and only if \(\xi_{1} \eta_{2}=\xi_{2} \eta_{1}\) .
  2. Find a similar necessary and sufficient condition for the linear dependence of two vectors in \(\mathbb{C}^{3}\) . Do the same for three vectors in \(\mathbb{C}^{3}\) .
  3. Is there a set of three linearly independent vectors in \(\mathbb{C}^{2}\) ?
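The condition in part 1 is a one-line test; the sketch below (the helper `dependent2` is ours) checks it against a few pairs where dependence or independence is evident by inspection:

```python
# Sketch for Exercise 5, part 1: (xi1, xi2) and (eta1, eta2) are linearly
# dependent if and only if xi1*eta2 == xi2*eta1.

def dependent2(x, y):
    """The stated criterion for two vectors in C^2."""
    return x[0] * y[1] == x[1] * y[0]

print(dependent2((1, 2), (2, 4)))   # (2,4) = 2*(1,2): dependent
print(dependent2((1, 2), (2, 5)))   # independent
print(dependent2((0, 0), (3, 7)))   # the zero vector makes any pair dependent
```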

Exercise 6. 

  1. Under what conditions on the scalars \(\xi\) and \(\eta\) are the vectors \((1, \xi)\) and \((1, \eta)\) in \(\mathbb{C}^{2}\) linearly dependent?
  2. Under what conditions on the scalars \(\xi\) , \(\eta\) , and \(\zeta\) are the vectors \((1, \xi, \xi^{2})\) , \((1, \eta, \eta^{2})\) , and \((1, \zeta, \zeta^{2})\) in \(\mathbb{C}^{3}\) linearly dependent?
  3. Guess and prove a generalization of (1) and (2) to \(\mathbb{C}^{n}\) .
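A numerical experiment can suggest the answer for the vectors \((1, \xi, \xi^{2})\), \((1, \eta, \eta^{2})\), \((1, \zeta, \zeta^{2})\) without giving away the proof; the sketch below (the helpers `det3` and `dep` are ours) tabulates when the determinant of their coordinates vanishes:

```python
# Experiment for Exercise 6: when are (1, s, s^2), s in {xi, eta, zeta},
# linearly dependent?  Test whether the 3x3 determinant vanishes.

def det3(a, b, c):
    """Determinant of the 3x3 matrix with rows a, b, c."""
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

def dep(xi, eta, zeta):
    rows = [(1, s, s * s) for s in (xi, eta, zeta)]
    return det3(*rows) == 0

print(dep(1, 2, 3))   # three distinct scalars
print(dep(1, 2, 2))   # a repeated scalar
```

Comparing the two outputs hints at the pattern to be guessed and proved.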

Exercise 7. 

  1. Find two bases in \(\mathbb{C}^{4}\) such that the only vectors common to both are \((0,0,1,1)\) and \((1,1,0,0)\) .
  2. Find two bases in \(\mathbb{C}^{4}\) that have no vectors in common so that one of them contains the vectors \((1,0,0,0)\) and \((1,1,0,0)\) and the other one contains the vectors \((1,1,1,0)\) and \((1,1,1,1)\) .

Exercise 8. 

  1. Under what conditions on the scalar \(\xi\) do the vectors \((1,1,1)\) and \((1, \xi, \xi^{2})\) form a basis of \(\mathbb{C}^{3}\) ?
  2. Under what conditions on the scalar \(\xi\) do the vectors \((0,1, \xi)\) , \((\xi, 0,1)\) , and \((\xi, 1,1+\xi)\) form a basis of \(\mathbb{C}^{3}\) ?

Exercise 9. Consider the set of all those vectors in \(\mathbb{C}^{2}\) each of whose coordinates is either \(0\) or \(1\) ; how many different bases does this set contain?

Exercise 10. If \(\mathcal{X}\) is the set consisting of the six vectors \((1,1,0,0)\) , \((1,0,1,0)\) , \((1,0,0,1)\) , \((0,1,1,0)\) , \((0,1,0,1)\) , \((0,0,1,1)\) in \(\mathbb{C}^{4}\) , find two different maximal linearly independent subsets of \(\mathcal{X}\) . (A maximal linearly independent subset of \(\mathcal{X}\) is a linearly independent subset \(\mathcal{Y}\) of \(\mathcal{X}\) that becomes linearly dependent every time that a vector of \(\mathcal{X}\) that is not already in \(\mathcal{Y}\) is adjoined to \(\mathcal{Y}\) .)
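The parenthetical definition suggests a mechanical procedure: scan \(\mathcal{X}\) in some order and adjoin each vector that keeps the collection independent; the result is one maximal independent subset, and scanning in a different order yields others. A sketch (the helpers `rank` and `maximal_independent` are ours):

```python
# Sketch for Exercise 10: grow a maximal linearly independent subset of a
# finite set X greedily, adjoining each vector that preserves independence.
from fractions import Fraction

def rank(vectors):
    """Rank of a list of vectors, by Gaussian elimination over Q."""
    rows = [[Fraction(v) for v in vec] for vec in vectors]
    r = 0
    for col in range(len(rows[0]) if rows else 0):
        piv = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][col] != 0:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def maximal_independent(X):
    """One maximal independent subset of X, scanning X in the given order."""
    kept = []
    for v in X:
        if rank(kept + [v]) > rank(kept):
            kept.append(v)
    return kept

X = [(1, 1, 0, 0), (1, 0, 1, 0), (1, 0, 0, 1),
     (0, 1, 1, 0), (0, 1, 0, 1), (0, 0, 1, 1)]
one = maximal_independent(X)
print(one)   # scanning X in another order produces a different maximal subset
```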

Exercise 11. Prove that every vector space has a basis. (The proof of this fact is out of reach for those not acquainted with some transfinite trickery, such as well-ordering or Zorn’s lemma.)