DSC 140B
Problems tagged with diagonalization

Problem #042

Tags: linear algebra, quiz-03, spectral theorem, eigenvectors, diagonalization, lecture-04

Suppose \(A\) is a \(d \times d\) symmetric matrix.

True or False: There exists an orthonormal basis in which \(A\) is diagonal.

Solution

True.

By the spectral theorem, every \(d \times d\) symmetric matrix has \(d\) mutually orthogonal eigenvectors. If we normalize these eigenvectors, they form an orthonormal basis.

In this eigenbasis, the matrix \(A\) is diagonal: the diagonal entries are the eigenvalues of \(A\).
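As a quick numerical sketch of the spectral theorem (the \(2 \times 2\) matrix below is an arbitrary choice for illustration, not part of the problem), NumPy's `eigh` returns an orthonormal eigenbasis for a symmetric matrix, and changing to that basis makes the matrix diagonal:

```python
import numpy as np

# An arbitrary symmetric matrix, chosen for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is specialized for symmetric matrices: it returns real
# eigenvalues (ascending) and orthonormal eigenvectors as columns.
eigenvalues, U = np.linalg.eigh(A)

# The columns of U are orthonormal, so U is an orthogonal matrix.
assert np.allclose(U.T @ U, np.eye(2))

# In the eigenbasis, A becomes diagonal, with the eigenvalues
# (here 1 and 3) on the diagonal.
D = U.T @ A @ U
print(np.round(D, 10))
```
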

Problem #053

Tags: linear algebra, quiz-03, lecture-05, eigenvalues, eigenvectors, diagonalization

Let \(\vec f\) be a linear transformation with eigenvectors and eigenvalues:

$$\begin{align*}\hat{u}^{(1)}&= \frac{1}{\sqrt 2}(1, 1)^T & \lambda_1 &= 2\\\hat{u}^{(2)}&= \frac{1}{\sqrt 2}(-1, 1)^T & \lambda_2 &= -1 \end{align*}$$

What is \(\vec f(\vec x)\) for \(\vec x = (3, 1)^T\)?

Solution

\(\vec f(\vec x) = (3, 5)^T\).

We'll take the three-step approach: (1) find the coordinates of \(\vec x\) in the eigenbasis, (2) apply the transformation in that basis, where its matrix \(A_\mathcal{U}\) is diagonal with the eigenvalues on the diagonal, and (3) convert back to the standard basis.

We saw in lecture that we can do this all in one go using the change of basis matrix \(U\):

\[\vec f(\vec x) = U^T A_\mathcal{U} U \vec x \]

where \(U\) is the change-of-basis matrix whose rows are the unit eigenvectors and \(A_\mathcal{U}\) is the diagonal matrix with the eigenvalues on the diagonal. In this case, they are:

\[ U = \begin{pmatrix} \frac{1}{\sqrt 2} & \frac{1}{\sqrt 2}\\[0.5em] \frac{-1}{\sqrt 2} & \frac{1}{\sqrt 2} \end{pmatrix}\qquad A_\mathcal{U} = \begin{pmatrix} 2 & 0 \\ 0 & -1 \end{pmatrix}\]

Then:

$$\begin{align*}\vec f(\vec x) &= U^T A_\mathcal{U} U \vec x \\&= \begin{pmatrix} \frac{1}{\sqrt 2} & \frac{-1}{\sqrt 2}\\[0.5em] \frac{1}{\sqrt 2} & \frac{1}{\sqrt 2} \end{pmatrix}\begin{pmatrix} 2 & 0 \\ 0 & -1 \end{pmatrix}\begin{pmatrix} \frac{1}{\sqrt 2} & \frac{1}{\sqrt 2}\\[0.5em] \frac{-1}{\sqrt 2} & \frac{1}{\sqrt 2} \end{pmatrix}\begin{pmatrix} 3 \\ 1 \end{pmatrix}\\&= \begin{pmatrix} 3 \\ 5 \end{pmatrix}\end{align*}$$
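The computation above can be checked numerically; this sketch just transcribes the matrices from the solution into NumPy:

```python
import numpy as np

# Unit eigenvectors as the ROWS of U, matching the convention above.
U = np.array([[1.0, 1.0],
              [-1.0, 1.0]]) / np.sqrt(2)
A_u = np.diag([2.0, -1.0])   # eigenvalues on the diagonal

x = np.array([3.0, 1.0])
fx = U.T @ A_u @ U @ x       # f(x) = U^T A_U U x
print(fx)                     # [3. 5.]
```
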

Problem #054

Tags: linear algebra, quiz-03, lecture-05, eigenvalues, eigenvectors, diagonalization

Let \(\vec f\) be a linear transformation with eigenvectors and eigenvalues:

$$\begin{align*}\hat{u}^{(1)}&= \frac{1}{\sqrt{2}}(1, 1, 0)^T & \lambda_1 &= 3\\\hat{u}^{(2)}&= \frac{1}{\sqrt{2}}(1, -1, 0)^T & \lambda_2 &= 2\\\hat{u}^{(3)}&= (0, 0, 1)^T & \lambda_3 &= 1 \end{align*}$$

What is \(\vec f(\vec x)\) for \(\vec x = (1, 1, 2)^T\)?

Solution

\(\vec f(\vec x) = (3, 3, 2)^T\).

We'll take the three-step approach: (1) find the coordinates of \(\vec x\) in the eigenbasis, (2) apply the transformation in that basis, where its matrix \(A_\mathcal{U}\) is diagonal with the eigenvalues on the diagonal, and (3) convert back to the standard basis.

We saw in lecture that we can do this all in one go using the change of basis matrix \(U\):

\[\vec f(\vec x) = U^T A_\mathcal{U} U \vec x \]

where \(U\) is the change-of-basis matrix whose rows are the unit eigenvectors and \(A_\mathcal{U}\) is the diagonal matrix with the eigenvalues on the diagonal. In this case, they are:

\[ U = \begin{pmatrix} \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} & 0\\[0.5em] \frac{1}{\sqrt{2}} & \frac{-1}{\sqrt{2}} & 0\\[0.5em] 0 & 0 & 1 \end{pmatrix}\qquad A_\mathcal{U} = \begin{pmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{pmatrix}\]

Then:

$$\begin{align*}\vec f(\vec x) &= U^T A_\mathcal{U} U \vec x \\&= \begin{pmatrix} \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} & 0\\[0.5em] \frac{1}{\sqrt{2}} & \frac{-1}{\sqrt{2}} & 0\\[0.5em] 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} & 0\\[0.5em] \frac{1}{\sqrt{2}} & \frac{-1}{\sqrt{2}} & 0\\[0.5em] 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} 1 \\ 1 \\ 2 \end{pmatrix}\\&= \begin{pmatrix} 3 \\ 3 \\ 2 \end{pmatrix}\end{align*}$$
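The same computation can be verified step by step in NumPy (the matrices below transcribe those in the solution), making each of the three stages of the change-of-basis argument explicit:

```python
import numpy as np

# Rows of U are the unit eigenvectors; A_u holds the eigenvalues.
s = 1 / np.sqrt(2)
U = np.array([[s,  s, 0],
              [s, -s, 0],
              [0,  0, 1]])
A_u = np.diag([3.0, 2.0, 1.0])
x = np.array([1.0, 1.0, 2.0])

coords = U @ x           # step 1: coordinates of x in the eigenbasis
scaled = A_u @ coords    # step 2: scale each coordinate by its eigenvalue
fx = U.T @ scaled        # step 3: convert back to the standard basis
print(fx)                # [3. 3. 2.]
```
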

Problem #055

Tags: linear algebra, quiz-03, lecture-05, eigenvalues, eigenvectors, diagonalization

Let \(\vec f\) be a linear transformation with eigenvectors and eigenvalues:

$$\begin{align*}\hat{u}^{(1)}&= \frac{1}{2}(1, 1, 1, 1)^T & \lambda_1 &= 4\\\hat{u}^{(2)}&= \frac{1}{2}(1, 1, -1, -1)^T & \lambda_2 &= 2\\\hat{u}^{(3)}&= \frac{1}{2}(1, -1, 1, -1)^T & \lambda_3 &= 1\\\hat{u}^{(4)}&= \frac{1}{2}(1, -1, -1, 1)^T & \lambda_4 &= 0 \end{align*}$$

What is \(\vec f(\vec x)\) for \(\vec x = (4, 0, 0, 0)^T\)?

Solution

\(\vec f(\vec x) = (7, 5, 3, 1)^T\).

We'll take the three-step approach: (1) find the coordinates of \(\vec x\) in the eigenbasis, (2) apply the transformation in that basis, where its matrix \(A_\mathcal{U}\) is diagonal with the eigenvalues on the diagonal, and (3) convert back to the standard basis.

We saw in lecture that we can do this all in one go using the change of basis matrix \(U\):

\[\vec f(\vec x) = U^T A_\mathcal{U} U \vec x \]

where \(U\) is the change-of-basis matrix whose rows are the unit eigenvectors and \(A_\mathcal{U}\) is the diagonal matrix with the eigenvalues on the diagonal. In this case, they are:

\[ U = \frac{1}{2}\begin{pmatrix} 1 & 1 & 1 & 1\\ 1 & 1 & -1 & -1\\ 1 & -1 & 1 & -1\\ 1 & -1 & -1 & 1 \end{pmatrix}\qquad A_\mathcal{U} = \begin{pmatrix} 4 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}\]

Then:

$$\begin{align*}\vec f(\vec x) &= U^T A_\mathcal{U} U \vec x \\&= \frac{1}{2}\begin{pmatrix} 1 & 1 & 1 & 1\\ 1 & 1 & -1 & -1\\ 1 & -1 & 1 & -1\\ 1 & -1 & -1 & 1 \end{pmatrix}\begin{pmatrix} 4 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}\frac{1}{2}\begin{pmatrix} 1 & 1 & 1 & 1\\ 1 & 1 & -1 & -1\\ 1 & -1 & 1 & -1\\ 1 & -1 & -1 & 1 \end{pmatrix}\begin{pmatrix} 4 \\ 0 \\ 0 \\ 0 \end{pmatrix}\\&= \begin{pmatrix} 7 \\ 5 \\ 3 \\ 1 \end{pmatrix}\end{align*}$$
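As before, the arithmetic can be double-checked numerically; this sketch transcribes the \(4 \times 4\) matrices from the solution:

```python
import numpy as np

# Rows of U are the unit eigenvectors (a scaled Hadamard-type matrix).
U = 0.5 * np.array([[1.0,  1.0,  1.0,  1.0],
                    [1.0,  1.0, -1.0, -1.0],
                    [1.0, -1.0,  1.0, -1.0],
                    [1.0, -1.0, -1.0,  1.0]])
A_u = np.diag([4.0, 2.0, 1.0, 0.0])  # eigenvalues on the diagonal

x = np.array([4.0, 0.0, 0.0, 0.0])
fx = U.T @ A_u @ U @ x               # f(x) = U^T A_U U x
print(fx)                             # [7. 5. 3. 1.]
```
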