8 Linear Algebra V

In this class, work through the following problems.

  1. Here are some quick ones. For each answer, explain why.
    1. If a matrix is singular, how is that reflected in its eigenvalues?
    2. Is it possible for an $n\times n$ matrix with all $n$ eigenvalues different to be defective?
    3. Is a square null-matrix defective? Singular?
    4. Is a unit matrix defective? Singular?
    5. Is a square matrix with all coefficients 1 singular? Defective?
    6. Is a square matrix with all coefficients 0 except $a_{12}=1$ singular? Defective? How many independent eigenvectors are there? To answer this, find the eigenvalue(s) of this simple triangular matrix and the dimension of the null space of $A-\lambda I$ for each.
    7. Is a square matrix with all coefficients 0 except $a_{i,i+1}=1$ for $i=1,2,\ldots,n-1$ singular? Defective? How many independent eigenvectors are there? (A numerical check of these last two parts is sketched below.)
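
    To make parts 6 and 7 concrete, here is a minimal numerical sketch, assuming NumPy is available; the size $n=4$ is illustrative only. It counts the independent eigenvectors as the dimension of the null space of $A-\lambda I$, i.e. $n$ minus the rank:

```python
# Sketch for parts 6 and 7: the matrix with ones on the first
# superdiagonal and zeros everywhere else (here n = 4).
import numpy as np

n = 4
N = np.diag(np.ones(n - 1), k=1)      # a_{i,i+1} = 1, all else 0

# Triangular matrix: the eigenvalues are the diagonal entries, all 0,
# so the matrix is singular.
print(np.linalg.eigvals(N))

# Independent eigenvectors for lambda = 0: dimension of the null space
# of N - 0*I, which is n - rank(N) = 1.  Only one eigenvector for an
# n-fold eigenvalue: the matrix is defective.
print(n - np.linalg.matrix_rank(N))   # prints 1
```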

  2. An antisymmetric matrix is a matrix for which $A^{\rm T}=-A$. Are the eigenvalues of an antisymmetric matrix real too? To check, write down a nontrivial antisymmetric $2\times2$ matrix and see. In fact, the eigenvalues of an antisymmetric matrix are always purely imaginary, i.e. proportional to $i=\sqrt{-1}$. The (complex) eigenvectors are still orthogonal, as long as you remember that you must take the complex conjugate of the first vector in a dot product, i.e. replace every $i$ in it by $-i$. Verify this for your antisymmetric matrix.
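
    A numerical cross-check, assuming NumPy; the particular matrix below is just one possible choice:

```python
# One possible nontrivial antisymmetric 2x2 matrix: A^T = -A.
import numpy as np

A = np.array([[0.0, 2.0],
              [-2.0, 0.0]])

lam, V = np.linalg.eig(A)
print(lam)                          # purely imaginary: 2j and -2j

# Dot product with the FIRST vector complex-conjugated (every i
# replaced by -i); np.vdot conjugates its first argument.
print(np.vdot(V[:, 0], V[:, 1]))    # (numerically) zero: orthogonal
```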

  3. Analyze and accurately draw the quadratic curve

    \begin{displaymath}
- 2 x^2 + xy + 3 y^2 = 5
\end{displaymath}

    using matrix diagonalization. Show the exact as well as the approximate values for all angles. Repeat for the curve,

    \begin{displaymath}
1 x^2 + xy + 6 y^2 = 5
\end{displaymath}

    Note: if you add, say, 3 times the unit matrix to a matrix $A$, then the eigenvectors of $A$ do not change; the only effect is that every eigenvalue increases by 3, as you can readily verify from the definition of an eigenvector. Use this to your advantage. (The sketch below exploits exactly this relation between the two curves.)
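
    The following sketch, assuming NumPy, shows the diagonalization behind the first curve and the unit-matrix trick for the second; the variable names and the use of $\arctan$ for the axis angle are illustrative:

```python
# Write -2x^2 + xy + 3y^2 as [x y] M [x y]^T with M symmetric.
import numpy as np

M = np.array([[-2.0, 0.5],
              [0.5, 3.0]])

lam, P = np.linalg.eigh(M)      # symmetric M: orthonormal eigenvectors
print(lam)                      # principal-axis coefficients

# In the rotated coordinates the curve is lam[0] x'^2 + lam[1] y'^2 = 5.
# Angle of the first principal axis with respect to the x-axis:
print(np.degrees(np.arctan2(P[1, 0], P[0, 0])))

# The hint in action: M + 3I is the matrix of 1x^2 + xy + 6y^2, so the
# second curve has the SAME principal axes, with eigenvalues lam + 3.
lam2, P2 = np.linalg.eigh(M + 3.0 * np.eye(2))
print(lam2, lam + 3.0)          # these agree
```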

  4. Given

    \begin{displaymath}
A =
\left(
\begin{array}{rrr}
2 & 1 & 1 \\
1 & 2 & 1 \\
1 & 1 & 2
\end{array} \right)
\end{displaymath}

    Without doing any mathematics, what can you say immediately about the eigenvalues and eigenvectors of this matrix? Now find the equation for the eigenvalues; it is a cubic. However, one eigenvalue $\lambda_i$, one that makes $A-\lambda_iI$ singular, is immediately obvious from looking at $A$. Which one? Explain. Factor the corresponding factor $(\lambda-\lambda_i)$ out of the cubic, then find the roots of the remaining quadratic. Number the single eigenvalue $\lambda_1$, and the double one $\lambda_2$ and $\lambda_3$. The two basis vectors that you find for the null space of $A-\lambda_2I$, call them $\vec e_2^{ *}$ and $\vec e_3^{ *}$, will not be orthogonal to each other. To make them orthogonal, you must eliminate the component that $\vec e_3^{ *}$ has in the direction of $\vec e_2^{ *}$. In particular, if $\vec e_2$ is the unit vector in the direction of $\vec e_2^{ *}$, then $\vec e_3^{ *}\cdot \vec e_2$ is the scalar component of $\vec e_3^{ *}$ in that direction. Multiply by the unit vector $\vec e_2$ to get the vector component, and subtract it from $\vec e_3^{ *}$:

    \begin{displaymath}
\vec e_3^{ **} = \vec e_3^{ *} - (\vec e_3^{ *} \cdot \vec e_2)\vec e_2
\end{displaymath}

    (This trick of making vectors orthogonal by subtracting away the components in the wrong directions is called Gram-Schmidt orthogonalization.) Now normalize this vector to length 1. Then describe the transformation of basis that turns matrix $A$ into a diagonal one. What is the transformation matrix $P$ from old to new, and what is its inverse? What is the diagonal matrix $A'$? Do your eigenvectors form a right-handed or a left-handed coordinate system? (A numerical sketch of the whole procedure follows below.)
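
    Here is a minimal numerical sketch of the whole procedure, assuming NumPy; the hand-picked null-space vectors are one possible choice, not the only one:

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

print(np.linalg.eigvals(A))    # lambda_1 = 4 (single), lambda_2 = lambda_3 = 1

# A - 1*I is the all-ones matrix, so its null space consists of all
# vectors whose components sum to zero.  Two (non-orthogonal) choices:
e2s = np.array([1.0, -1.0, 0.0])
e3s = np.array([1.0, 0.0, -1.0])

# Gram-Schmidt: subtract from e3s its vector component along the
# unit vector e2, then normalize the result.
e2 = e2s / np.linalg.norm(e2s)
e3ss = e3s - np.dot(e3s, e2) * e2
e3 = e3ss / np.linalg.norm(e3ss)

# Eigenvector for lambda_1 = 4: each row of A sums to 4, so (1,1,1) works.
e1 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)

P = np.column_stack([e1, e2, e3])   # orthonormal columns, so P^{-1} = P^T
print(P.T @ A @ P)                  # the diagonal matrix A' = diag(4, 1, 1)
print(np.linalg.det(P))             # +1: right-handed; -1: left-handed
```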