Linear Algebra – Matrices – The inverse of a matrix

Definition: A square matrix \(A\) is said to be invertible if there exists a matrix \(C\) such that \(AC=I=CA\). Such a matrix \(C\) is then called an inverse of the matrix \(A\).

Theorem: If a square matrix \(A\) is invertible, then its inverse is unique. Notation: \(A^{-1}\).

Proof: Suppose that \(A\) is invertible and that both \(B\) and \(C\) are inverses of \(A\). Then we have \(AB=I=BA\) and \(AC=I=CA\). It then follows that

\[B=IB=(CA)B=C(AB)=CI=C.\]

Theorem: Let \(A=\begin{pmatrix}a&b\\c&d\end{pmatrix}\). If \(ad-bc=0\), then \(A\) is not invertible (such a matrix is also called singular). If \(ad-bc\neq0\), then \(A\) is invertible and

\[A^{-1}=\frac{1}{ad-bc}\begin{pmatrix}d&-b\\-c&a\end{pmatrix}.\]

The number \(ad-bc\) is called the determinant of \(A\):

Definition: If \(A=\begin{pmatrix}a&b\\c&d\end{pmatrix}\), then we have:

\[\text{det}(A)=\begin{vmatrix}a&b\\c&d\end{vmatrix}=ad-bc.\]
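
For example, if \(A=\begin{pmatrix}2&5\\1&3\end{pmatrix}\), then \(\det(A)=2\cdot3-5\cdot1=1\neq0\), so \(A\) is invertible with

\[A^{-1}=\frac{1}{1}\begin{pmatrix}3&-5\\-1&2\end{pmatrix}=\begin{pmatrix}3&-5\\-1&2\end{pmatrix} \quad\text{and indeed}\quad \begin{pmatrix}2&5\\1&3\end{pmatrix}\begin{pmatrix}3&-5\\-1&2\end{pmatrix}=\begin{pmatrix}1&0\\0&1\end{pmatrix}=\begin{pmatrix}3&-5\\-1&2\end{pmatrix}\begin{pmatrix}2&5\\1&3\end{pmatrix}.\]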

For a non-square matrix \(A\) it is sometimes possible to find a (non-square) matrix \(B\) such that \(AB=I\) or \(BA=I\).

Example: Let \(A=\begin{pmatrix}3&-1&1\\-1&1&2\end{pmatrix}\) and \(B=\begin{pmatrix}-1&-1\\-3&-2\\1&1\end{pmatrix}\), then we have:

\[AB=\begin{pmatrix}3&-1&1\\-1&1&2\end{pmatrix}\begin{pmatrix}-1&-1\\-3&-2\\1&1\end{pmatrix}=\begin{pmatrix}1&0\\0&1\end{pmatrix} \quad\text{and}\quad BA=\begin{pmatrix}-1&-1\\-3&-2\\1&1\end{pmatrix}\begin{pmatrix}3&-1&1\\-1&1&2\end{pmatrix}=\begin{pmatrix}-2&0&-3\\-7&1&-7\\2&0&3\end{pmatrix}.\]

The matrix \(B\) is then called a right inverse of \(A\), and the matrix \(A\) is then called a left inverse of \(B\).

However, these are not unique, since we also have (for instance)

\[\begin{pmatrix}3&-1&1\\-1&1&2\end{pmatrix}\begin{pmatrix}2&-4\\4&-9\\-1&3\end{pmatrix}=\begin{pmatrix}1&0\\0&1\end{pmatrix} \quad\text{and}\quad \begin{pmatrix}1&-1&-1\\-2&1&1\end{pmatrix}\begin{pmatrix}-1&-1\\-3&-2\\1&1\end{pmatrix}=\begin{pmatrix}1&0\\0&1\end{pmatrix}.\]
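
This non-uniqueness is no accident: since \(A\) has more columns than rows, the homogeneous system \(A\mathbf{x}=\mathbf{0}\) has nontrivial solutions (here all multiples of \((-3,-7,2)^T\)), and adding such a solution to any column of a right inverse of \(A\) produces another right inverse. Indeed,

\[\begin{pmatrix}2&-4\\4&-9\\-1&3\end{pmatrix}=\begin{pmatrix}-1&-1\\-3&-2\\1&1\end{pmatrix}+\begin{pmatrix}-3\\-7\\2\end{pmatrix}\begin{pmatrix}-1&1\end{pmatrix}.\]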

Algorithm: Apply row reduction to the augmented matrix \((A\,|\,I)\). If \(A\) is row equivalent to \(I\), then this yields \((I\,|\,A^{-1})\). Otherwise, \(A\) is not invertible.
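
As a numerical illustration, the algorithm can be sketched in Python with NumPy (a minimal sketch; the function name and the use of partial pivoting are our own choices, not part of these notes):

    import numpy as np

    def inverse_via_row_reduction(A):
        """Row reduce (A | I); return A^{-1} if A ~ I, else None."""
        A = np.asarray(A, dtype=float)
        n = A.shape[0]
        M = np.hstack([A, np.eye(n)])            # the augmented matrix (A | I)
        for j in range(n):
            p = j + np.argmax(np.abs(M[j:, j]))  # largest pivot candidate in column j
            if np.isclose(M[p, j], 0.0):
                return None                      # no pivot: A is not row equivalent to I
            M[[j, p]] = M[[p, j]]                # swap the pivot row into place
            M[j] /= M[j, j]                      # scale the pivot to 1
            for i in range(n):
                if i != j:
                    M[i] -= M[i, j] * M[j]       # clear the rest of column j
        return M[:, n:]                          # now M = (I | A^{-1})

For instance, inverse_via_row_reduction([[1, 0, 0], [1, 1, 0], [1, 1, 1]]) reproduces the inverse computed in Exercise 33 below.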

Lay §2.2, Exercise 33: Note that \(\begin{pmatrix}1&0\\1&1\end{pmatrix}^{-1}=\begin{pmatrix}1&0\\-1&1\end{pmatrix}\) and

\[\left(\left.\begin{matrix}1&0&0\\1&1&0\\1&1&1\end{matrix}\,\right|\,\begin{matrix}1&0&0\\0&1&0\\0&0&1\end{matrix}\right)\sim \left(\left.\begin{matrix}1&0&0\\0&1&0\\0&1&1\end{matrix}\,\right|\,\begin{matrix}1&0&0\\-1&1&0\\-1&0&1\end{matrix}\right)\sim \left(\left.\begin{matrix}1&0&0\\0&1&0\\0&0&1\end{matrix}\,\right|\,\begin{matrix}1&0&0\\-1&1&0\\0&-1&1\end{matrix}\right).\]

This leads to the conjecture that if \(A=(a_{ij})\) is an \(n\times n\) matrix with \(a_{ij}=1\) if \(i\geq j\) and \(a_{ij}=0\) if \(i < j\), then \(A^{-1}=B=(b_{ij})\) is the \(n\times n\) matrix with \(b_{ij}=1\) if \(i=j\), \(b_{ij}=-1\) if \(i=j+1\) and \(b_{ij}=0\) otherwise. We can prove this as follows:

\[AB=\Bigg(\mathbf{a}_1\;\mathbf{a}_2\;\ldots\;\mathbf{a}_n\Bigg)B=\Bigg(\mathbf{a}_1-\mathbf{a}_2\;\mathbf{a}_2-\mathbf{a}_3\;\ldots\;\mathbf{a}_{n-1}-\mathbf{a}_n\;\mathbf{a}_n\Bigg)=I\]

and

\[BA=\Bigg(\mathbf{b}_1\;\mathbf{b}_2\;\ldots\;\mathbf{b}_n\Bigg)A=\Bigg(\mathbf{b}_1+\mathbf{b}_2+\cdots+\mathbf{b}_n\;\mathbf{b}_2+\mathbf{b}_3+\cdots+\mathbf{b}_n\;\ldots\;\mathbf{b}_{n-1}+\mathbf{b}_n\;\mathbf{b}_n\Bigg)=I.\]

Here both products telescope: since \(\mathbf{a}_j=\mathbf{e}_j+\mathbf{e}_{j+1}+\cdots+\mathbf{e}_n\), we have \(\mathbf{a}_j-\mathbf{a}_{j+1}=\mathbf{e}_j\), and since \(\mathbf{b}_j=\mathbf{e}_j-\mathbf{e}_{j+1}\) for \(j<n\) and \(\mathbf{b}_n=\mathbf{e}_n\), the sums \(\mathbf{b}_j+\mathbf{b}_{j+1}+\cdots+\mathbf{b}_n\) collapse to \(\mathbf{e}_j\).

Lay §2.2, Exercise 34: Note that \(\begin{pmatrix}1&0\\1&2\end{pmatrix}^{-1}=\frac{1}{2}\begin{pmatrix}2&0\\-1&1\end{pmatrix}=\begin{pmatrix}1&0\\-\frac{1}{2}&\frac{1}{2}\end{pmatrix}\) and

\[\left(\left.\begin{matrix}1&0&0\\1&2&0\\1&2&3\end{matrix}\,\right|\,\begin{matrix}1&0&0\\0&1&0\\0&0&1\end{matrix}\right)\sim \left(\left.\begin{matrix}1&0&0\\0&2&0\\0&2&3\end{matrix}\,\right|\,\begin{matrix}1&0&0\\-1&1&0\\-1&0&1\end{matrix}\right)\sim \left(\left.\begin{matrix}1&0&0\\0&2&0\\0&0&3\end{matrix}\,\right|\,\begin{matrix}1&0&0\\-1&1&0\\0&-1&1\end{matrix}\right) \quad\Longrightarrow\quad\begin{pmatrix}1& 0&0\\1&2&0\\1&2&3\end{pmatrix}^{-1}=\begin{pmatrix}1&0&0\\-\frac{1}{2}&\frac{1}{2}&0\\0&-\frac{1}{3}&\frac{1}{3}\end{pmatrix}.\]

This leads to the conjecture that if \(A=\begin{pmatrix}1&0&0&\ldots&0&0&0\\1&2&0&\ldots&0&0&0\\1&2&3&\ldots&0&0&0\\\vdots&\vdots&\vdots&\ddots&\vdots&\vdots&\vdots\\ 1&2&3&\ldots&n-2&0&0\\1&2&3&\ldots&n-2&n-1&0\\1&2&3&\ldots&n-2&n-1&n\end{pmatrix}\), then \(A^{-1}=B=\begin{pmatrix}1&0&0&\ldots&0&0&0\\-\frac{1}{2}&\frac{1}{2}&0&\ldots&0&0&0\\0&-\frac{1}{3}&\frac{1}{3}&\ldots&0&0&0\\ \vdots&\vdots&\vdots&\ddots&\vdots&\vdots&\vdots\\0&0&0&\ldots&\frac{1}{n-2}&0&0\\0&0&0&\ldots&-\frac{1}{n-1}&\frac{1}{n-1}&0\\0&0&0&\ldots&0&-\frac{1}{n}&\frac{1}{n}\end{pmatrix}\). We can prove this as follows:

\[AB=\Bigg(\mathbf{a}_1\;\mathbf{a}_2\;\ldots\;\mathbf{a}_n\Bigg)B=\Bigg(\mathbf{a}_1-\frac{1}{2}\mathbf{a}_2\;\frac{1}{2}\mathbf{a}_2-\frac{1}{3}\mathbf{a}_3\;\ldots\;\frac{1}{n-1}\mathbf{a}_{n-1}-\frac{1}{n}\mathbf{a}_n\;\frac{1}{n}\mathbf{a}_n\Bigg)=I\]

and

\[BA=\Bigg(\mathbf{b}_1\;\mathbf{b}_2\;\ldots\;\mathbf{b}_n\Bigg)A=\Bigg(\mathbf{b}_1+\mathbf{b}_2+\cdots+\mathbf{b}_n\;2\mathbf{b}_2+2\mathbf{b}_3+\cdots+2\mathbf{b}_n\;\ldots\;(n-1)\mathbf{b}_{n-1}+(n-1)\mathbf{b}_n\;n\mathbf{b}_n\Bigg)=I.\]

Again everything telescopes: column \(j\) of \(A\) equals \(j(\mathbf{e}_j+\mathbf{e}_{j+1}+\cdots+\mathbf{e}_n)\), so \(\frac{1}{j}\,\mathbf{a}_j-\frac{1}{j+1}\,\mathbf{a}_{j+1}=\mathbf{e}_j\), and \(\mathbf{b}_j+\mathbf{b}_{j+1}+\cdots+\mathbf{b}_n=\frac{1}{j}\,\mathbf{e}_j\), so that \(j(\mathbf{b}_j+\cdots+\mathbf{b}_n)=\mathbf{e}_j\).

Definition: A linear transformation \(T:\mathbb{R}^n\to\mathbb{R}^n\) is said to be invertible if there exists a transformation \(S:\mathbb{R}^n\to\mathbb{R}^n\) such that \(S(T(\mathbf{x}))=\mathbf{x}\) and \(T(S(\mathbf{x}))=\mathbf{x}\) for all \(\mathbf{x}\in\mathbb{R}^n\). The (unique linear) transformation \(S\) is called the inverse of \(T\). Notation: \(S=T^{-1}\).

Theorem: Let \(T:\mathbb{R}^n\to\mathbb{R}^n\) be a linear transformation with standard matrix \(A\). Then we have: \(T\) is invertible if and only if \(A\) is an invertible matrix. In that case, the linear transformation \(S:\mathbb{R}^n\to\mathbb{R}^n\) with \(S(\mathbf{x})=A^{-1}\mathbf{x}\) is the unique inverse (transformation) of \(T\).
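
For example, the matrix \(\begin{pmatrix}1&0\\1&1\end{pmatrix}\) from Exercise 33 above is the standard matrix of the shear \(T(x_1,x_2)=(x_1,x_1+x_2)\); since \(\begin{pmatrix}1&0\\1&1\end{pmatrix}^{-1}=\begin{pmatrix}1&0\\-1&1\end{pmatrix}\), the inverse transformation is \(T^{-1}(x_1,x_2)=(x_1,x_2-x_1)\).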


Last modified on March 1, 2021
© Roelof Koekoek
