Understanding the linear transformation given by matrix multiplication












I am working on the following problem. I found it in an old qualifying exam, but I'm not sure of its original source. It asks:



Let $A$ be an $n \times n$ matrix with entries in $\mathbb{R}$. Let $\phi_A \colon \mathbb{R}^n \to \mathbb{R}^n$ be the linear transformation defined by $\phi_A(v) = A \cdot v$ for each column vector $v \in \mathbb{R}^n$. Set $W = \{ v \in \mathbb{R}^n \mid \phi_A(v) = v \}$ and assume that $\dim \ker \phi_A + \dim W = n$.



  1. Give the minimal polynomial for $A$.

  2. Describe all possible Jordan canonical forms for $A$.

  3. Prove that if $A$ is a symmetric matrix, then $W$ is orthogonal to $\ker \phi_A$. Assume that $\mathbb{R}^n$ is endowed with its standard inner product.

I don't understand the difference between saying that $A$ is the matrix $[T]_\beta$ of a transformation $T$ with respect to some basis $\beta$ of $\mathbb{R}^n$ and saying that $\phi_A$ is the linear transformation defined by $\phi_A(v) = Av$.



In any case, here are the rest of my thoughts: it appears, given the restriction on the dimensions of $\ker \phi_A$ and $W$, that $A$ is an idempotent matrix. That is, it fixes every vector in $W$ and sends every vector in $\ker \phi_A$ to zero, and these two subspaces fill out $\mathbb{R}^n$. It follows that the minimal polynomial is $M_A(x) = x(x-1)$.



Since the minimal polynomial is a product of distinct linear factors, $A$ is diagonalizable. Thus if $r = \dim \ker \phi_A = \dim \operatorname{null} A$, then there are $r$ zeros and $n-r$ ones on the diagonal. Since the minimal polynomial $x(x-1)$ forces both eigenvalues to occur, $1 \le r \le n-1$, so there are a total of $n-1$ such diagonal matrices, up to permutation of the basis.
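The idempotence and minimal-polynomial claims can be sanity-checked numerically. A minimal sketch, using a small non-symmetric idempotent matrix of my own choosing (not from the problem):

```python
import numpy as np

# A non-diagonal idempotent 2x2 matrix: A^2 = A, eigenvalues 0 and 1.
A = np.array([[1.0, 1.0],
              [0.0, 0.0]])

# Idempotent, so the polynomial x(x - 1) annihilates A.
idempotent = np.allclose(A @ A, A)
annihilated = np.allclose(A @ (A - np.eye(2)), np.zeros((2, 2)))

# Spectrum is {0, 1}: one zero and one 1, matching r = dim ker = 1.
eigenvalues = sorted(np.linalg.eigvals(A).real)
```

Here $W = \operatorname{span}\{(1,0)\}$ and $\ker \phi_A = \operatorname{span}\{(1,-1)\}$; note these are not orthogonal, since this $A$ is not symmetric.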



For the last part, if $A$ is a real symmetric matrix, then it is self-adjoint. For any $v \in W$ and $u \in \operatorname{null} A$ we have
$$(u,v) = (u,Av) = (A^*u,v) = (Au,v) = (0,v) = 0,$$
where the first equality uses $Av = v$, the third uses $A^* = A$, and the last follows by linearity of the standard inner product.
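The symmetric case can also be checked numerically with an orthogonal projection. A sketch under my own choice of example ($P = QQ^T$ projects onto the column span of $Q$; the names `rng`, `Q`, `P` are mine):

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthonormal basis Q for a random 2-dimensional subspace of R^4.
Q, _ = np.linalg.qr(rng.standard_normal((4, 2)))

# P = Q Q^T is the orthogonal projection onto col(Q): symmetric and idempotent.
P = Q @ Q.T
assert np.allclose(P, P.T) and np.allclose(P @ P, P)

# v in W (fixed points of P): any vector already in col(Q).
v = Q @ np.array([1.0, 2.0])
# u in ker(P): the residual of an arbitrary vector after projection.
x = rng.standard_normal(4)
u = x - P @ x

fixed = np.allclose(P @ v, v)      # P fixes v
killed = np.allclose(P @ u, 0.0)   # P annihilates u
inner = float(u @ v)               # should vanish: W is orthogonal to ker(P)
```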







asked Aug 1 at 18:11 by misogrumpy
edited Aug 1 at 18:25 by Bernard




















          1 Answer
For your first question: if you define a linear map by $\phi(v) = \phi_A(v) = Av$ for $v \in \mathbb{R}^n$, then you essentially define $\phi$ to be the linear map whose matrix representation in the standard basis is $A$.



To see this, just note that $\phi(e_i)$ is the $i$-th column of $A$, where $e_i$ is the $i$-th standard basis vector.



This $\phi$ is then uniquely determined. You can of course define some $\psi$ as the map represented by $A$ with respect to some other basis $\beta$. This $\psi$ will also be uniquely determined, but in general it will have little in common with $\phi$.
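The column observation above is easy to verify concretely. A minimal sketch (the random matrix `A` is my own illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))  # an arbitrary example matrix

# phi(e_i) = A e_i is exactly the i-th column of A.
cols_match = all(
    np.allclose(A @ np.eye(3)[:, i], A[:, i]) for i in range(3)
)
```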




Note further that the set $W$, the set of fixed points of $\phi$, is precisely the eigenspace for the eigenvalue $1$. With the assumption that $\dim(\ker(\phi)) + \dim(W) = n$ we can infer that



$$
\mathbb{R}^n = \ker(\phi) \oplus W.
$$



To see this, we note that $\ker(\phi) \cap W = \{\mathbf{0}\}$: if $v \in \ker(\phi) \cap W$, then $\mathbf{0} = \phi(v) = v$. The rest follows from the dimension formula for sums of subspaces:



$$
\dim(U_1 + U_2) = \dim(U_1) + \dim(U_2) - \dim(U_1 \cap U_2).
$$



Note that this means in particular that we can diagonalize $\phi$ with the values $0$ and $1$: choose a basis $(b_1, \dots, b_r)$ for $\ker(\phi)$ and a basis $(b'_1, \dots, b'_{n-r})$ for $W$; together these form a basis $(b_1, \dots, b_r, b'_1, \dots, b'_{n-r})$ of $\mathbb{R}^n$, since $\ker(\phi)$ and $W$ are in direct sum equal to $\mathbb{R}^n$. The matrix of $\phi$ in this choice of basis is



$$
\operatorname{diag}(0, \dots, 0, 1, \dots, 1)
$$



and is indeed idempotent; thus we identify (as we could have done directly) $\phi$ as a projection of $\mathbb{R}^n$.



Consequently the minimal polynomial is $X(X-1)$, as you rightly remarked. I just wanted to provide a different viewpoint on how to arrive at the minimal polynomial, via diagonalization and direct sums, as it fits perfectly into this projection scenario. Note that, up to permutation of the basis, this diagonal matrix is also the Jordan normal form of $A$.



Note that a non-trivial projection always has $X(X-1)$ as its minimal polynomial and may thus always be diagonalized with such a matrix representation.
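The basis-change picture above can be illustrated concretely: build a projection from $\operatorname{diag}(0,\dots,0,1,\dots,1)$ in an arbitrary basis and check its properties. A minimal sketch (the invertible basis matrix `B` is my own arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 5, 2  # r basis vectors for ker(phi), n - r for W

# Any invertible B: its first r columns span ker(phi), the rest span W.
B = rng.standard_normal((n, n))
D = np.diag([0.0] * r + [1.0] * (n - r))

# The map represented by diag(0,...,0,1,...,1) in the basis B.
P = B @ D @ np.linalg.inv(B)

# Non-trivial projection: P^2 = P with P != 0 and P != I,
# so its minimal polynomial is X(X - 1).
idempotent = np.allclose(P @ P, P)
nontrivial = not np.allclose(P, 0.0) and not np.allclose(P, np.eye(n))
rank = np.linalg.matrix_rank(P)  # equals n - r = dim W
```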




Your last remark concerning the orthogonality of $W$ and $\ker(\phi)$ is perfectly fine.






• Awesome. What you said about $\phi$ being represented by $A$ with respect to the standard basis makes sense. Would it be wrong to say that $\ker \phi = \operatorname{null} A$? Then to notice that, since $\mathbb{R}^n = \ker \phi \oplus W$, the eigenspaces corresponding to the eigenvalues $0$ and $1$ supply a basis of eigenvectors (as opposed to generalized eigenvectors), so the matrix is diagonalizable. The note about the intersection being trivial is important! I should have realized to account for that. Thanks again!
            – misogrumpy
            Aug 1 at 23:29











• No, $\ker \phi$ and $\operatorname{null} \phi$ are just two different ways of writing this space.
            – zzuussee
            Aug 2 at 8:14










answered Aug 1 at 19:07 by zzuussee
edited Aug 1 at 19:15










