Prove that there is only a single eigenvector corresponding to each of the distinct eigenvalues
























Prove that if the $n$ eigenvalues of a matrix $A_{n\times n}$ are distinct, then

1) there are $n$ eigenvectors $\bar c_i$, one corresponding to each of those eigenvalues, and

2) $\bar c_1, \bar c_2, \dots, \bar c_n$ are all linearly independent.
















      asked Jul 21 at 20:00









      Aditya





















          1 Answer













I will prove that if the $n$ eigenvalues of a matrix $A_{n\times n}$ are distinct, then 1) there are $n$ eigenvectors $\bar c_i$, one corresponding to each of those eigenvalues.

Definition: the algebraic multiplicity $a(\lambda_i)$ of an eigenvalue $\lambda_i$ is the power to which $(\lambda - \lambda_i)$ divides the characteristic polynomial.

So if an eigenvalue $\lambda_j$ repeats $k$ times, then $a(\lambda_j) = k$.

So in the case of distinct eigenvalues $\lambda_1,\ldots,\lambda_n$, we can conclude that each of the $n$ eigenvalues has algebraic multiplicity 1:

$$a(\lambda_i) = 1 \quad \forall\, \lambda_i$$

Now what will the geometric multiplicities of these eigenvalues be?




Definition: The geometric multiplicity $g(\lambda_i)$ of an eigenvalue is the dimension of the eigenspace $E_{\lambda_i} = N(A - \lambda_i I)$ corresponding to $\lambda_i$, where $N(\cdot)$ denotes the null space of $A - \lambda_i I$.

The eigenspace $E_{\lambda_i}$ is best understood as the vector space spanned by all the eigenvectors that correspond to the eigenvalue $\lambda_i$; i.e., the collection of all vectors $\bar v$ that satisfy $A\bar v = \lambda_i \bar v$ forms the eigenspace.

An eigenspace has dimension greater than zero by definition: $\lambda$ is an eigenvalue of $A$ precisely when $Ax = \lambda x$ for some $x \neq 0$, so the eigenspace contains a nonzero vector. Since only the zero subspace has dimension zero, we can conclude $0 < g(\lambda_i)$, i.e. $1 \le g(\lambda_i)$.
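This definition can be checked numerically: $g(\lambda_i)$ is the dimension of the null space of $A - \lambda_i I$, which equals $n$ minus its rank. A minimal sketch with NumPy, using a made-up 3×3 matrix (the matrix and the helper function are illustrative, not from the question):

```python
import numpy as np

# Hypothetical example: the eigenvalue 2 has algebraic multiplicity 2
# but a one-dimensional eigenspace; the eigenvalue 3 is simple.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

def geometric_multiplicity(A, lam, tol=1e-10):
    """dim N(A - lam*I), computed as n minus the numerical rank."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)

print(geometric_multiplicity(A, 2.0))  # 1: the eigenspace of 2 is a line
print(geometric_multiplicity(A, 3.0))  # 1
```

Note that this example also illustrates $g(\lambda_i) \le a(\lambda_i)$ with a strict inequality for the repeated eigenvalue 2.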



A standard proof that $g(\lambda_i) \le a(\lambda_i)$ shows that the characteristic polynomial has $(\lambda - \lambda_i)^{g(\lambda_i)}$ at least as a factor. Hence

$$1 \le g(\lambda_i) \le a(\lambda_i)$$

Thus in the case of distinct eigenvalues,

$$1 \le g(\lambda_i) \le 1 \quad \forall\, \lambda_i$$

$$g(\lambda_i) = 1 \quad \forall\, \lambda_i$$

$g(\lambda_i)$ is also, equivalently, the number of linearly independent eigenvectors associated with $\lambda_i$: to span a vector space (the eigenspace) of dimension $g(\lambda_i)$, we need exactly that many independent eigenvectors.

Thus we have proved that associated with each distinct eigenvalue is a single independent eigenvector (a one-dimensional eigenspace).
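As a numerical sanity check of both parts of the question (an illustration with a made-up upper-triangular matrix, not part of the proof), we can verify that when the $n$ eigenvalues are distinct, the matrix whose columns are the computed eigenvectors has full rank, i.e. the eigenvectors are linearly independent:

```python
import numpy as np

# Hypothetical example: an upper-triangular matrix whose diagonal
# entries 1, 2, 3 are its (distinct) eigenvalues.
A = np.array([[1.0, 4.0, 5.0],
              [0.0, 2.0, 6.0],
              [0.0, 0.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are eigenvectors

# Distinct eigenvalues -> one independent eigenvector each, and the
# n eigenvectors together are linearly independent (full rank n).
n_distinct = len(set(np.round(eigvals, 8)))
rank = np.linalg.matrix_rank(eigvecs)
print(n_distinct, rank)  # 3 3
```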























          • Answering your own question is acceptable here. But that usually happens only if, when you first asked, you didn't know an answer and only found one after quite a while. Here you seem to have posted your answer right after asking. Why? If you are unsure of your proof, you can put the proof in your question, flag the place where you have doubts, and use the proof-verification tag.
            – Ethan Bolker
            Jul 21 at 20:04










          • I only answered part 1. I actually wanted to answer this question (math.stackexchange.com/questions/29371/…) using the method I gave above; that is why I posted half the solution. I hope to develop the answer later and provide a geometric intuition for part 2 of the question.
            – Aditya
            Jul 21 at 20:11











          • That question is seven years old and already has several really nice answers.
            – Ethan Bolker
            Jul 21 at 20:25










          • Oh, let me go through them again then. I didn't manage to grasp the intuition for why distinct eigenspaces imply independence but not the converse, even though the solutions are algebraically elegant. I wanted to visualise something like this: since each of the $n$ eigenspaces is restricted to a line, we should be able to say that all $n$ eigenvectors are linearly independent, since together they should span $F^n$?
            – Aditya
            Jul 21 at 20:40











          answered Jul 21 at 20:00









          Aditya
