If A is a fixed n x n matrix and T(B) = AB, how do you prove that if A is diagonalizable then T is diagonalizable?

I know T is represented by an n^2 x n^2 matrix, but I don't know how to prove that T is diagonalizable.

1 Answer

    I'm guessing that T is a linear transformation from the space of n x n matrices into itself. That's the only way I could see T being an n^2 x n^2 matrix.
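
    To make that n^2 x n^2 picture concrete, here is a minimal NumPy sketch (the helper name matrix_of_T is my own, not from the question): it builds the matrix representing T in the standard basis of n x n matrices by applying B -> AB to each standard basis matrix, vectorizing the result column-by-column, and stacking these as columns.

    import numpy as np

    def matrix_of_T(A):
        """Matrix of B -> A @ B in the standard basis of n x n matrices.
        Matrices are vectorized column-by-column ('F' order)."""
        n = A.shape[0]
        cols = []
        for k in range(n * n):
            E = np.zeros((n, n))
            E[k % n, k // n] = 1.0  # k-th standard basis matrix in 'F' order
            cols.append((A @ E).flatten(order='F'))
        return np.column_stack(cols)

    # Sanity check: the matrix of T applied to vec(B) should equal vec(A B).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))
    B = rng.standard_normal((3, 3))
    assert np.allclose(matrix_of_T(A) @ B.flatten(order='F'), (A @ B).flatten(order='F'))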

    Suppose A is diagonalisable. Equivalently, there exists a basis for R^n made entirely out of eigenvectors of A. We need to prove an equivalent statement for T, that is, that there is a basis for R^(n^2) made up entirely of eigenvectors (or eigenmatrices!) for T. The only way I can see to prove the existence of such a basis is to find such a basis.

    Now, we can actually construct eigenmatrices rather easily. We just make a matrix whose column vectors are each either the 0 vector or an eigenvector of A, all for the same eigenvalue (not all of them 0, of course). That is, if u(1), u(2), ..., u(n) are each either 0 or an eigenvector for the eigenvalue λ, then consider the image under T of the following matrix:

    ( u(1) | u(2) | ... | u(n) )

    Using properties of matrix multiplication:

    T( u(1) | u(2) | ... | u(n) )

    = A( u(1) | u(2) | ... | u(n) )

    = ( Au(1) | Au(2) | ... | Au(n) )

    = ( λu(1) | λu(2) | ... | λu(n) )

    = λ( u(1) | u(2) | ... | u(n) )
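
    As a quick numerical illustration of this computation (the particular A, λ, and u below are just made-up examples, not from the question): a matrix whose columns are either 0 or eigenvectors of A for the same eigenvalue really is sent to a scalar multiple of itself.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 3.0]])             # eigenvalues 2 and 3
    lam = 3.0
    u = np.array([1.0, 1.0])               # A @ u = 3 * u
    M = np.column_stack([u, np.zeros(2)])  # columns: u and the zero vector
    assert np.allclose(A @ u, lam * u)
    assert np.allclose(A @ M, lam * M)     # T(M) = A M = lambda * M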

    Actually, this turns out to be the only way to construct an eigenmatrix. Given any matrix with columns u(1), u(2), ..., u(n):

    T( u(1) | u(2) | ... | u(n) ) = ( Au(1) | Au(2) | ... | Au(n) )

    which equals

    λ( u(1) | u(2) | ... | u(n) )

    if and only if:

    Au(1) = λu(1)

    Au(2) = λu(2)

    ...

    Au(n) = λu(n)

    So each u(i) is either an eigenvector of A for λ, or the 0 vector. This also shows that T has exactly the same eigenvalues as A.
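
    We can check this eigenvalue claim numerically with the matrix_of_T helper sketched earlier (again only an illustration, assuming that sketch): every eigenvalue of A shows up as an eigenvalue of T, repeated n times.

    import numpy as np

    A = np.diag([1.0, 2.0, 5.0])
    eig_A = np.sort(np.linalg.eigvals(A).real)
    eig_T = np.sort(np.linalg.eigvals(matrix_of_T(A)).real)  # matrix_of_T from the sketch above
    assert np.allclose(eig_T, np.repeat(eig_A, 3))           # 1, 2, 5, each appearing n = 3 times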

    Now, we know that A has a basis of eigenvectors. Let that basis be:

    {b(1), b(2), ..., b(n)}

    We can construct n linearly independent eigenmatrices from each b(i), simply by:

    ( b(i) | 0 | 0 | ... | 0 )

    ( 0 | b(i) | 0 | ... | 0 )

    ( 0 | 0 | b(i) | ... | 0 )

    ...

    ( 0 | 0 | 0 | ... | b(i) )

    It is easy to check that these n matrices are linearly independent, and, as we saw, they are all eigenmatrices for T, since b(i) is an eigenvector of A. So the question now is: is the whole collection of n^2 matrices linearly independent? We know that the matrices derived from a single eigenvector are linearly independent of each other, but are they linearly independent of the matrices derived from the other eigenvectors?

    Let b(i, j) be the eigenmatrix with b(i) in the jth column, and 0 columns elsewhere. We just need to prove that:

    sum over 1 <= i, j <= n of a(i, j) b(i, j) = 0   ==>   a(i, j) = 0 for all i, j.

    Collect the terms with the same j value. The matrices in each of these groups have their only non-zero column in the same place: the jth column. So any linear combination of them has 0 columns everywhere else. Moreover, no matrix from another group contributes anything to that column, so if the jth column of the total sum is 0, it is exclusively because of that group. Since every column of the total sum is 0, each group must sum to the 0 matrix. That is, for any j:

    a(1, j)b(1, j) + a(2, j)b(2, j) + ... + a(n, j)b(n, j) = 0

    But then all the action happens in the jth column, where this equation reads a(1, j)b(1) + a(2, j)b(2) + ... + a(n, j)b(n) = 0. Therefore, by the linear independence of {b(1), b(2), ..., b(n)}, we can conclude that:

    a(1, j) = a(2, j) = ... = a(n, j) = 0

    for any j. Therefore, the whole set of n^2 matrices is linearly independent. Since it has as many elements as the dimension of the space of n x n matrices (which is n^2), it is a basis made up entirely of eigenmatrices of T. Therefore T is diagonalisable.
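
    For what it's worth, the whole argument can be checked numerically on a small example (a sketch reusing the matrix_of_T helper from above; the particular A is arbitrary): take a basis of eigenvectors of A, form the n^2 matrices b(i, j), confirm each is an eigenmatrix of T, and confirm that together they are linearly independent, so the change of basis they give diagonalises the matrix of T.

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])           # diagonalisable, eigenvalues 5 and 2
    n = A.shape[0]
    evals, evecs = np.linalg.eig(A)      # columns of evecs: the basis b(1), ..., b(n)

    eigenmatrices = []
    for i in range(n):                   # eigenvector b(i) with eigenvalue evals[i]
        for j in range(n):               # put b(i) in the jth column, 0 columns elsewhere
            Bij = np.zeros((n, n))
            Bij[:, j] = evecs[:, i]
            assert np.allclose(A @ Bij, evals[i] * Bij)   # b(i, j) is an eigenmatrix of T
            eigenmatrices.append(Bij.flatten(order='F'))

    # n^2 linearly independent eigenmatrices form a basis, so T is diagonalisable:
    P = np.column_stack(eigenmatrices)
    assert np.linalg.matrix_rank(P) == n * n
    D = np.linalg.inv(P) @ matrix_of_T(A) @ P             # matrix_of_T from the first sketch
    assert np.allclose(D, np.diag(np.diag(D)))            # D is diagonal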
