Linear Algebra – Systems of linear equations – Linear independence
Definition: A set of vectors \(\{\mathbf{v}_1,\mathbf{v}_2,\ldots,\mathbf{v}_p\}\) in \(\mathbb{R}^n\) is called linearly independent if the vector equation \(x_1\mathbf{v}_1+x_2\mathbf{v}_2+\cdots+x_p\mathbf{v}_p=\mathbf{0}\) has only the trivial solution (\(x_1=0,\;x_2=0,\;\ldots,\;x_p=0\)). Otherwise, \(\{\mathbf{v}_1,\mathbf{v}_2,\ldots,\mathbf{v}_p\}\) is called linearly dependent.
Example: Consider \(\{\mathbf{v}_1,\mathbf{v}_2,\mathbf{v}_3\}\) with \(\mathbf{v}_1=\begin{pmatrix}1\\-1\\0\end{pmatrix}\), \(\mathbf{v}_2=\begin{pmatrix}1\\0\\1\end{pmatrix}\) and \(\mathbf{v}_3=\begin{pmatrix}0\\1\\1\end{pmatrix}\).
Consider the vector equation \(x_1\mathbf{v}_1+x_2\mathbf{v}_2+x_3\mathbf{v}_3=\mathbf{0}\):
\[\begin{pmatrix}1&1&0\\-1&0&1\\0&1&1\end{pmatrix}\sim\begin{pmatrix}1&1&0\\0&1&1\\0&1&1\end{pmatrix} \sim\begin{pmatrix}1&0&-1\\0&1&1\\0&0&0\end{pmatrix}\quad\Longrightarrow\quad \left\{\begin{array}{l}x_1=x_3\\x_2=-x_3\\x_3\;\text{is free.}\end{array}\right.\]There is a free variable, hence: \(\{\mathbf{v}_1,\mathbf{v}_2,\mathbf{v}_3\}\) is linearly dependent. For instance, we have: \(\mathbf{v}_1-\mathbf{v}_2+\mathbf{v}_3=\mathbf{0}\) or equivalently \(\mathbf{v}_3=-\mathbf{v}_1+\mathbf{v}_2\).
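The row reduction above can be checked numerically. The following sketch (using NumPy, which is not part of these notes) confirms that the matrix with columns \(\mathbf{v}_1,\mathbf{v}_2,\mathbf{v}_3\) has rank 2, i.e. fewer pivots than columns, and verifies the dependence relation \(\mathbf{v}_1-\mathbf{v}_2+\mathbf{v}_3=\mathbf{0}\):

```python
import numpy as np

# Columns are v1, v2, v3 from the example above.
A = np.array([[1, 1, 0],
              [-1, 0, 1],
              [0, 1, 1]])

# rank(A) < number of columns means the columns are linearly dependent.
print(np.linalg.matrix_rank(A))  # 2, while A has 3 columns

# The dependence relation v1 - v2 + v3 = 0 found by row reduction:
v1, v2, v3 = A[:, 0], A[:, 1], A[:, 2]
print(v1 - v2 + v3)  # [0 0 0]
```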
Theorem: The columns of a matrix \(A\) are linearly independent if and only if \(A\mathbf{x}=\mathbf{0}\) has only the trivial solution \(\mathbf{x}=\mathbf{0}\).
Equivalently, \(A\) has a pivot position in every column, since in that case there are no free variables.
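The pivot criterion can be illustrated with SymPy (an assumption of this sketch, not part of the notes): `rref()` returns the reduced echelon form together with the pivot column indices, so independence amounts to every column index appearing as a pivot.

```python
from sympy import Matrix

# A has a pivot in every column exactly when Ax = 0 has only x = 0.
A = Matrix([[1, 1],
            [-1, 0],
            [0, 1]])
_, pivots = A.rref()
print(pivots)  # (0, 1): a pivot in both columns, so the columns are independent
```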
Theorem: A set of two vectors \(\{\mathbf{v}_1,\mathbf{v}_2\}\) is linearly dependent if and only if at least one of the vectors is a multiple of the other. The set is linearly independent if neither of the vectors is a multiple of the other.
Theorem: A set \(\{\mathbf{v}_1,\mathbf{v}_2,\ldots,\mathbf{v}_p\}\) of two or more vectors is linearly dependent if and only if at least one of the vectors is a linear combination of the other vectors.
Warning: This does not say that every vector in the set is a linear combination of the other vectors.
Theorem: A set of vectors \(\{\mathbf{v}_1,\mathbf{v}_2,\ldots,\mathbf{v}_p\}\) in \(\mathbb{R}^n\) that contains the zero vector is linearly dependent.
Proof: Suppose that \(\mathbf{v}_1=\mathbf{0}\). Then any nonzero coefficient on \(\mathbf{v}_1\) gives a nontrivial solution, for instance: \(17\mathbf{v}_1+0\mathbf{v}_2+\cdots+0\mathbf{v}_p=\mathbf{0}\). Hence: \(\{\mathbf{v}_1,\mathbf{v}_2,\ldots,\mathbf{v}_p\}\) is linearly dependent.
Theorem: A set of vectors \(\{\mathbf{v}_1,\mathbf{v}_2,\ldots,\mathbf{v}_p\}\) in \(\mathbb{R}^n\) with \(p>n\) is linearly dependent.
Proof: Let \(A=\Bigg(\mathbf{v}_1\;\mathbf{v}_2\;\ldots\;\mathbf{v}_p\Bigg)\). Then \(A\) is an \(n\times p\) matrix, so \(A\mathbf{x}=\mathbf{0}\) is a homogeneous system of \(n\) equations in \(p\) unknowns. Since \(n < p\), there are at most \(n\) pivots, hence at most \(n\) basic variables and therefore at least \(p-n > 0\) free variables. Hence: \(\{\mathbf{v}_1,\mathbf{v}_2,\ldots,\mathbf{v}_p\}\) is linearly dependent.
Examples:
- The set \(\left\{\begin{pmatrix}1\\5\end{pmatrix},\begin{pmatrix}2\\-1\end{pmatrix},\begin{pmatrix}3\\4\end{pmatrix}\right\}\) is linearly dependent.
- The set \(\left\{\begin{pmatrix}1\\0\\-1\end{pmatrix},\begin{pmatrix}2\\3\\-1\end{pmatrix},\begin{pmatrix}3\\1\\2\end{pmatrix},\begin{pmatrix}-1\\0\\5\end{pmatrix}\right\}\) is linearly dependent.
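Both example sets have more vectors than entries (\(p>n\)), so the theorem guarantees dependence. As a sketch (using SymPy, an assumption of this illustration), the null space of the matrix whose columns are the given vectors is nontrivial in each case:

```python
from sympy import Matrix

# Three vectors in R^2: p = 3 > n = 2.
A1 = Matrix([[1, 2, 3],
             [5, -1, 4]])
# Four vectors in R^3: p = 4 > n = 3.
A2 = Matrix([[1, 2, 3, -1],
             [0, 3, 1, 0],
             [-1, -1, 2, 5]])

# A nonempty null space basis means Ax = 0 has a nontrivial solution,
# i.e. the columns are linearly dependent.
print(len(A1.nullspace()))  # 1
print(len(A2.nullspace()))  # 1
```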
Last modified on March 22, 2021