We shall say, whenever \(x=\sum_{i} \alpha_{i} x_{i}\), that \(x\) is a linear combination of \(\{x_{i}\}\); we shall use without any further explanation all the simple grammatical implications of this terminology. Thus we shall say, in case \(x\) is a linear combination of \(\{x_{i}\}\), that \(x\) is linearly dependent on \(\{x_{i}\}\); we shall leave to the reader the proof that if \(\{x_{i}\}\) is linearly independent, then a necessary and sufficient condition that \(x\) be a linear combination of \(\{x_{i}\}\) is that the enlarged set, obtained by adjoining \(x\) to \(\{x_{i}\}\), be linearly dependent. Note that, in accordance with the definition of an empty sum, the origin is a linear combination of the empty set of vectors; it is, moreover, the only vector with this property.
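As a concrete illustration of the adjunction criterion just stated (the particular vectors here are ours, chosen only for illustration):

```latex
% In \(\mathbb{R}^{2}\), the one-element set \(\{x_{1}\}\) with \(x_{1}=(1,0)\)
% is linearly independent. The vector \(x=(3,0)\) is a linear combination of
% \(\{x_{1}\}\), since \(x=3x_{1}\); correspondingly, the enlarged set
% \(\{x_{1}, x\}\) is linearly dependent:
\[
  3x_{1} - x = 3(1,0) - (3,0) = (0,0).
\]
% By contrast, \(y=(0,1)\) is not a linear combination of \(\{x_{1}\}\), and,
% in agreement with the criterion, the enlarged set \(\{x_{1}, y\}\) is
% linearly independent.
```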
The following theorem is the fundamental result concerning linear dependence.
Theorem 1. The set of non-zero vectors \(x_{1}, \ldots, x_{n}\) is linearly dependent if and only if some \(x_{k}\), \(2 \leq k \leq n\), is a linear combination of the preceding ones.
Proof. Let us suppose that the vectors \(x_{1}, \ldots, x_{n}\) are linearly dependent, and let \(k\) be the first integer between \(2\) and \(n\) for which \(x_{1}, \ldots, x_{k}\) are linearly dependent. (Such a \(k\) exists: since \(x_{1} \neq 0\), the set consisting of \(x_{1}\) alone is linearly independent, and, if worse comes to worst, our assumption assures us that \(k=n\) will do.) Then \[\alpha_{1} x_{1}+\cdots+\alpha_{k} x_{k}=0\] for a suitable set of \(\alpha\)'s (not all zero); moreover, whatever the \(\alpha\)'s, we cannot have \(\alpha_{k}=0\), for then we should have a linear dependence relation among \(x_{1}, \ldots, x_{k-1}\), contrary to the definition of \(k\). Hence \[x_{k}=\frac{-\alpha_{1}}{\alpha_{k}} x_{1}+\cdots+\frac{-\alpha_{k-1}}{\alpha_{k}} x_{k-1},\] as was to be proved. This proves the necessity of our condition; sufficiency is clear since, as we remarked before, every set containing a linearly dependent subset is itself linearly dependent. ◻
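To see the argument of the proof at work in a small case (again, the specific vectors are illustrative and not part of the original text):

```latex
% In \(\mathbb{R}^{2}\), take \(x_{1}=(1,0)\), \(x_{2}=(0,1)\), \(x_{3}=(1,1)\).
% These are linearly dependent, since \(x_{1}+x_{2}-x_{3}=0\); the first
% \(k\) for which \(x_{1},\ldots,x_{k}\) are dependent is \(k=3\). Here
% \(\alpha_{1}=1\), \(\alpha_{2}=1\), \(\alpha_{3}=-1\neq 0\), and solving
% for \(x_{3}\) as in the proof gives
\[
  x_{3} = \frac{-\alpha_{1}}{\alpha_{3}}\,x_{1}
        + \frac{-\alpha_{2}}{\alpha_{3}}\,x_{2}
        = x_{1} + x_{2},
\]
% exhibiting \(x_{3}\) as a linear combination of the preceding vectors.
```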