8 Linear Algebra V

In this class:

  1. UNGRADED: An antisymmetric matrix is a matrix for which $A^{\rm T}=-A$. Are the eigenvalues of an antisymmetric real matrix real too? To check, write down the simplest nontrivial antisymmetric $2\times2$ matrix you can think of (one that is not also symmetric, i.e. not the zero matrix) and see. In fact, the eigenvalues of an antisymmetric matrix are always purely imaginary, i.e. proportional to $i=\sqrt{-1}$. The (complex) eigenvectors are orthogonal, as long as you remember that in the first vector of a dot product you must take the complex conjugate, i.e. replace every $i$ by $-i$. Verify this for your antisymmetric matrix.
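The claims above are easy to check numerically. A minimal sketch, using the antisymmetric matrix $\bigl(\begin{smallmatrix}0&1\\-1&0\end{smallmatrix}\bigr)$ as one possible choice (your own matrix may differ):

```python
import numpy as np

# One simple nontrivial antisymmetric 2x2 matrix (a sample choice):
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
assert np.allclose(A.T, -A)  # antisymmetry: A^T = -A

lam, V = np.linalg.eig(A)
print("eigenvalues:", lam)           # purely imaginary: +1j and -1j
assert np.allclose(lam.real, 0.0)    # no real part

# Orthogonality under the complex dot product: np.vdot conjugates
# its first argument, i.e. replaces every i by -i there.
inner = np.vdot(V[:, 0], V[:, 1])
assert np.isclose(abs(inner), 0.0)
```

Without the conjugation (plain `np.dot`) the two eigenvectors would not come out orthogonal, which is exactly the point of the exercise.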

  2. Analyze and accurately draw the quadratic curve

    \begin{displaymath}
- 2 x^2 + xy + 3 y^2 = 5
\end{displaymath}

    using matrix diagonalization. Show the exact as well as the approximate values for all angles. Repeat for the curve

    \begin{displaymath}
x^2 + xy + 6 y^2 = 5
\end{displaymath}

    Note: if you add, say, 3 times the unit matrix to a matrix $A$, the eigenvectors of $A$ do not change; it only causes the eigenvalues to increase by 3, as you can readily verify from the definition of eigenvectors and eigenvalues. Use this to your advantage.

  3. Given

    \begin{displaymath}
A =
\left(
\begin{array}{rrr}
2 & 1 & 1 \\
1 & 2 & 1 \\
1 & 1 & 2
\end{array} \right)
\end{displaymath}

    Without doing any mathematics, what can you say immediately about the eigenvalues and eigenvectors of this matrix? Now find the equation for the eigenvalues. It is a cubic one. However, one eigenvalue is immediately obvious from looking at $A$: what value $\lambda_i$ makes $A-\lambda_iI$ singular? Explain. Factor out the corresponding factor $(\lambda-\lambda_i)$ from the cubic, then find the roots of the remaining quadratic. Number the single eigenvalue $\lambda_1$, and the double one $\lambda_2$ and $\lambda_3$. The two basis vectors of the null space of $A-\lambda_2I$ that you find, call them $\vec e_2^{ *}$ and $\vec e_3^{ *}$, will not be orthogonal to each other. To make them orthogonal, you must eliminate the component that $\vec e_3^{ *}$ has in the direction of $\vec e_2^{ *}$. In particular, if $\vec e_2$ is the unit vector in the direction of $\vec e_2^{ *}$, then $\vec e_3^{ *}\cdot \vec e_2$ is the scalar component of $\vec e_3^{ *}$ in the direction of $\vec e_2^{ *}$. Multiply by the unit vector $\vec e_2$ to get the vector component, and subtract it from $\vec e_3^{ *}$:

    \begin{displaymath}
\vec e_3^{ **} = \vec e_3^{ *} - (\vec e_3^{ *} \cdot \vec e_2)\vec e_2
\end{displaymath}

    (This trick of making vectors orthogonal by subtracting away the components in the wrong directions is called Gram-Schmidt orthogonalization.) Now make this vector of length 1. Then describe the transformation of basis that turns matrix $A$ into a diagonal one. What is the transformation matrix $P$ from old to new and what is its inverse? What is the diagonal matrix $A'$? Do your eigenvectors form a right- or a left-handed coordinate system?
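The whole procedure can be checked numerically. A sketch, using $(1,-1,0)$ and $(1,0,-1)$ as one possible pair of null-space vectors of $A-\lambda_2 I$ (your own hand-picked pair may differ):

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

# lambda = 1 makes A - I singular (all three rows become identical),
# and lambda = 4 goes with the eigenvector (1, 1, 1).
e2_star = np.array([1.0, -1.0, 0.0])   # null-space vector of A - I
e3_star = np.array([1.0, 0.0, -1.0])   # another, not orthogonal to the first

# Gram-Schmidt: subtract the e2 component from e3*, then normalize.
e2 = e2_star / np.linalg.norm(e2_star)
e3_2star = e3_star - np.dot(e3_star, e2) * e2
e3 = e3_2star / np.linalg.norm(e3_2star)
e1 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)

# P has the orthonormal eigenvectors as columns, so its inverse is P^T:
P = np.column_stack([e1, e2, e3])
assert np.allclose(P.T @ P, np.eye(3))

A_prime = P.T @ A @ P
print(np.round(A_prime, 10))      # diagonal: diag(4, 1, 1)
print("det P =", np.linalg.det(P))  # sign tells right- vs left-handed
assert np.allclose(A_prime, np.diag([4.0, 1.0, 1.0]))
```

Note that with this ordering $\lambda_1 = 4$ is the single eigenvalue and $\lambda_2 = \lambda_3 = 1$ the double one, so $A' = \mathop{\rm diag}(4,1,1)$.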