Linear Algebra – Symmetric matrices and quadratic forms – The spectral theorem
Theorem: A square matrix \(A\) is orthogonally diagonalizable if and only if \(A\) is symmetric.
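Note that one direction is immediate: if \(A=PDP^T\) with \(P\) orthogonal and \(D\) diagonal, then
\[A^T=\left(PDP^T\right)^T=\left(P^T\right)^TD^TP^T=PDP^T=A,\]so \(A\) is symmetric. The converse, that every symmetric matrix is orthogonally diagonalizable, is the essential content of the spectral theorem.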
Examples:
1) Suppose that \(A=\begin{pmatrix}7&2\\2&4\end{pmatrix}\), then \(A\) is a symmetric matrix. Now we have:
\[|A-\lambda I|=\begin{vmatrix}7-\lambda&2\\2&4-\lambda\end{vmatrix}=\lambda^2-11\lambda+24=(\lambda-8)(\lambda-3).\]So the eigenvalues of \(A\) are: \(\lambda_1=8\) and \(\lambda_2=3\). Further we have:
\[\lambda_1=8:\quad\begin{pmatrix}-1&2\\2&-4\end{pmatrix}\sim\begin{pmatrix}1&-2\\0&0\end{pmatrix} \quad\Longrightarrow\quad\text{E}_8=\text{Span}\left\{\begin{pmatrix}2\\1\end{pmatrix}\right\}\]and
\[\lambda_2=3:\quad\begin{pmatrix}4&2\\2&1\end{pmatrix}\sim\begin{pmatrix}2&1\\0&0\end{pmatrix} \quad\Longrightarrow\quad\text{E}_3=\text{Span}\left\{\begin{pmatrix}-1\\2\end{pmatrix}\right\}.\]Note that \(\text{E}_8\perp\text{E}_3\). Now we have: \(A=PDP^T\) with \(P=\dfrac{1}{\sqrt{5}}\begin{pmatrix}2&-1\\1&2\end{pmatrix}\) and \(D=\begin{pmatrix}8&0\\0&3\end{pmatrix}\).
2) Suppose that \(A=\begin{pmatrix}5&-4&-2\\-4&5&2\\-2&2&2\end{pmatrix}\), then \(A\) is a symmetric matrix. Now we have:
\begin{align*} |A-\lambda I|&=\begin{vmatrix}5-\lambda&-4&-2\\-4&5-\lambda&2\\-2&2&2-\lambda\end{vmatrix} =\begin{vmatrix}1-\lambda&1-\lambda&0\\-4&5-\lambda&2\\-2&2&2-\lambda\end{vmatrix} =\begin{vmatrix}1-\lambda&0&0\\-4&9-\lambda&2\\-2&4&2-\lambda\end{vmatrix}\\[2.5mm] &=(1-\lambda)\begin{vmatrix}9-\lambda&2\\4&2-\lambda\end{vmatrix}=(1-\lambda)(\lambda^2-11\lambda+10)=-(\lambda-10)(\lambda-1)^2. \end{align*}So the eigenvalues of \(A\) are: \(\lambda_1=10\) with algebraic multiplicity \(1\) and \(\lambda_2=1\) with algebraic multiplicity \(2\). Further we have:
\[\lambda_1=10:\quad\begin{pmatrix}-5&-4&-2\\-4&-5&2\\-2&2&-8\end{pmatrix}\sim\begin{pmatrix}1&1&0\\1&-1&4\\0&0&0\end{pmatrix} \sim\begin{pmatrix}1&1&0\\0&1&-2\\0&0&0\end{pmatrix}\quad\Longrightarrow\quad\text{E}_{10}=\text{Span}\left\{\begin{pmatrix}-2\\2\\1\end{pmatrix}\right\}\]and
\[\lambda_2=1:\quad\begin{pmatrix}4&-4&-2\\-4&4&2\\-2&2&1\end{pmatrix}\sim\begin{pmatrix}-2&2&1\\0&0&0\\0&0&0\end{pmatrix} \quad\Longrightarrow\quad\text{E}_1=\text{Span}\left\{\begin{pmatrix}1\\1\\0\end{pmatrix},\begin{pmatrix}0\\-1\\2\end{pmatrix}\right\}.\]Also here it is easily seen that \(\text{E}_{10}\perp\text{E}_1\). The two basis vectors of \(\text{E}_1\) are not orthogonal to each other, so we apply the Gram-Schmidt orthogonalization process and replace the second vector by
\[\begin{pmatrix}0\\-1\\2\end{pmatrix}-\frac{\begin{pmatrix}0\\-1\\2\end{pmatrix}\cdot\begin{pmatrix}1\\1\\0\end{pmatrix}}{\begin{pmatrix}1\\1\\0\end{pmatrix}\cdot\begin{pmatrix}1\\1\\0\end{pmatrix}}\begin{pmatrix}1\\1\\0\end{pmatrix}=\begin{pmatrix}0\\-1\\2\end{pmatrix}+\frac{1}{2}\begin{pmatrix}1\\1\\0\end{pmatrix}=\frac{1}{2}\begin{pmatrix}1\\-1\\4\end{pmatrix},\]which gives an orthogonal basis of \(\text{E}_1\):
\[\text{E}_1=\text{Span}\left\{\begin{pmatrix}1\\1\\0\end{pmatrix},\begin{pmatrix}1\\-1\\4\end{pmatrix}\right\} \quad\text{with}\quad\begin{pmatrix}1\\1\\0\end{pmatrix}\perp\begin{pmatrix}1\\-1\\4\end{pmatrix}.\]Finally we obtain that
\[||\begin{pmatrix}-2\\2\\1\end{pmatrix}||=\sqrt{9}=3,\quad||\begin{pmatrix}1\\1\\0\end{pmatrix}||=\sqrt{2}\quad\text{and}\quad ||\begin{pmatrix}1\\-1\\4\end{pmatrix}||=\sqrt{18}=3\sqrt{2}.\]Now we have, for instance, that \(A=PDP^T\) with
\[P=\begin{pmatrix}-\frac{2}{3}&\frac{1}{\sqrt{2}}&\frac{1}{3\sqrt{2}}\\\frac{2}{3}&\frac{1}{\sqrt{2}}&-\frac{1}{3\sqrt{2}}\\ \frac{1}{3}&0&\frac{4}{3\sqrt{2}}\end{pmatrix}=\frac{1}{3\sqrt{2}}\begin{pmatrix}-2\sqrt{2}&3&1\\2\sqrt{2}&3&-1\\ \sqrt{2}&0&4\end{pmatrix}\quad\text{and}\quad D=\begin{pmatrix}10&0&0\\0&1&0\\0&0&1\end{pmatrix}=\text{diag}(10,1,1).\]
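As a numerical cross-check of both examples, the following sketch (using NumPy; not part of the original computation) verifies the first decomposition directly and recovers the second one with numpy.linalg.eigh, which is intended for symmetric matrices. The eigenvectors returned by eigh may differ from the ones above by a sign or, within \(\text{E}_1\), by a different orthonormal basis.

import numpy as np

# Example 1: verify the decomposition A = P D P^T found above
A1 = np.array([[7.0, 2.0],
               [2.0, 4.0]])
P1 = np.array([[2.0, -1.0],
               [1.0,  2.0]]) / np.sqrt(5)
D1 = np.diag([8.0, 3.0])
print(np.allclose(P1.T @ P1, np.eye(2)))    # True: the columns of P1 are orthonormal
print(np.allclose(P1 @ D1 @ P1.T, A1))      # True: A1 = P1 D1 P1^T

# Example 2: let eigh compute an orthogonal diagonalization numerically
A2 = np.array([[ 5.0, -4.0, -2.0],
               [-4.0,  5.0,  2.0],
               [-2.0,  2.0,  2.0]])
eigenvalues, P2 = np.linalg.eigh(A2)        # eigenvalues in ascending order
print(eigenvalues)                          # approximately [ 1.  1. 10.]
print(np.allclose(P2 @ np.diag(eigenvalues) @ P2.T, A2))  # True: A2 = P2 D P2^T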
Spectral decomposition of a symmetric matrix
A symmetric \(n\times n\) matrix \(A\) is orthogonally diagonalizable. This implies that there exists an orthonormal basis \(\{\mathbf{u}_1,\ldots,\mathbf{u}_n\}\) of \(\mathbb{R}^n\), consisting of eigenvectors of \(A\), say: \(A\mathbf{u}_i=\lambda_i\mathbf{u}_i\) for \(i=1,2,\ldots,n\). Then we have: \(A=PDP^T\) with \(P=\Bigg(\mathbf{u}_1\;\ldots\,\mathbf{u}_n\Bigg)\) and \(D=\text{diag}(\lambda_1,\ldots,\lambda_n)\). This can also be written in the form:
\[A=\lambda_1\mathbf{u}_1\mathbf{u}_1^T+\cdots+\lambda_n\mathbf{u}_n\mathbf{u}_n^T.\]This is called a spectral decomposition of the matrix \(A\). Note that each term in this sum is an \(n\times n\) matrix with rank \(1\), since each column of \(\lambda_i\mathbf{u}_i\mathbf{u}_i^T\) is a multiple of \(\mathbf{u}_i\). Every matrix \(\mathbf{u}_i\mathbf{u}_i^T\) is a projection matrix, since \(\mathbf{u}_i\mathbf{u}_i^T\mathbf{x}=(\mathbf{u}_i^T\mathbf{x})\mathbf{u}_i=(\mathbf{x}\cdot\mathbf{u}_i)\mathbf{u}_i\) is the (orthogonal) projection of \(\mathbf{x}\) onto the vector \(\mathbf{u}_i\).
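For instance, for the vector \(\mathbf{u}_1=\dfrac{1}{\sqrt{5}}\begin{pmatrix}2\\1\end{pmatrix}\) from the first example one checks directly that
\[\mathbf{u}_1\mathbf{u}_1^T=\frac{1}{5}\begin{pmatrix}2\\1\end{pmatrix}\begin{pmatrix}2&1\end{pmatrix}=\frac{1}{5}\begin{pmatrix}4&2\\2&1\end{pmatrix}\quad\text{and}\quad\left(\mathbf{u}_1\mathbf{u}_1^T\right)^2=\frac{1}{25}\begin{pmatrix}4&2\\2&1\end{pmatrix}\begin{pmatrix}4&2\\2&1\end{pmatrix}=\frac{1}{5}\begin{pmatrix}4&2\\2&1\end{pmatrix}=\mathbf{u}_1\mathbf{u}_1^T,\]a rank \(1\) matrix that projects every vector in \(\mathbb{R}^2\) orthogonally onto the line spanned by \(\mathbf{u}_1\).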
The proof of the spectral decomposition follows by using the column-row expansion:
\[A=PDP^T=\Bigg(\mathbf{u}_1\;\ldots\,\mathbf{u}_n\Bigg)\begin{pmatrix}\lambda_1&&\\&\ddots&\\&&\lambda_n\end{pmatrix} \Bigg(\;\;\begin{matrix}\mathbf{u}_1^T\\\vdots\\\mathbf{u}_n^T\end{matrix}\;\;\Bigg)=\Bigg(\lambda_1\mathbf{u}_1\;\ldots\;\lambda_n\mathbf{u}_n\Bigg) \Bigg(\;\;\begin{matrix}\mathbf{u}_1^T\\\vdots\\\mathbf{u}_n^T\end{matrix}\;\;\Bigg)=\lambda_1\mathbf{u}_1\mathbf{u}_1^T+\cdots+\lambda_n\mathbf{u}_n\mathbf{u}_n^T.\]Example: For \(A=\begin{pmatrix}7&2\\2&4\end{pmatrix}\) we earlier found the eigenvalues \(\lambda_1=8\) and \(\lambda_2=3\) with corresponding eigenvectors \(\mathbf{u}_1=\dfrac{1}{\sqrt{5}}\begin{pmatrix}2\\1\end{pmatrix}\) and \(\mathbf{u}_2=\dfrac{1}{\sqrt{5}}\begin{pmatrix}-1\\2\end{pmatrix}\). Then we have:
\[A=\lambda_1\mathbf{u}_1\mathbf{u}_1^T+\lambda_2\mathbf{u}_2\mathbf{u}_2^T=\frac{8}{5}\begin{pmatrix}2\\1\end{pmatrix}\begin{pmatrix}2&1\end{pmatrix} +\frac{3}{5}\begin{pmatrix}-1\\2\end{pmatrix}\begin{pmatrix}-1&2\end{pmatrix}=\frac{8}{5}\begin{pmatrix}4&2\\2&1\end{pmatrix} +\frac{3}{5}\begin{pmatrix}1&-2\\-2&4\end{pmatrix}=\frac{1}{5}\begin{pmatrix}35&10\\10&20\end{pmatrix}=\begin{pmatrix}7&2\\2&4\end{pmatrix}=A.\]
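The same sum can be formed numerically; a minimal NumPy sketch (not part of the original text), using np.outer for the rank \(1\) terms:

import numpy as np

# Orthonormal eigenvectors of A = [[7, 2], [2, 4]] found earlier
u1 = np.array([2.0, 1.0]) / np.sqrt(5)
u2 = np.array([-1.0, 2.0]) / np.sqrt(5)

# Spectral decomposition: A = 8*u1*u1^T + 3*u2*u2^T as a sum of rank-1 terms
A = 8 * np.outer(u1, u1) + 3 * np.outer(u2, u2)
print(A)            # [[7. 2.]
                    #  [2. 4.]]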