Checking Invertibility of Square Matrix

The question is:

Which statement is the same as asserting that a square matrix $Z \in \mathbb{R}^{n \times n}$ is invertible?

  • (A) The columns of $Z$ span $\mathbb{R}^n$.

  • (B) $Z$ has no eigenvectors.

  • (C) $Z$ has no eigenvectors of eigenvalue $0$.

  • (D) $\ker Z = \{\mathbf{0}\}$

D is true by the rank-nullity theorem and properties of linear maps.
C is true, since having $0$ as an eigenvalue is equivalent to being noninvertible.
B is false by extension of C.

I am unsure of A.




























  • Please take the time to format the question properly using either MathJax or LaTeX.
    – Ahmad Bazzi, Aug 3 at 13:18

  • @AhmadBazzi Is there a tutorial on using MathJax or LaTeX?
    – PERTURBATIONFLOW, Aug 3 at 13:18

  • And also, is it "statement" or "statement(s)"?
    – Ahmad Bazzi, Aug 3 at 13:21

  • @AhmadBazzi It's a group of individual statements, ordered A, B, C, D from top to bottom.
    – PERTURBATIONFLOW, Aug 3 at 13:23

  • Math formatting tutorial/reference.
    – Omnomnomnom, Aug 3 at 13:26














asked Aug 3 at 13:15 by PERTURBATIONFLOW, edited Aug 3 at 13:36 by zzuussee











4 Answers



























You're right about your assertions. Let me just add some notes on them:

For (b), think of the identity matrix: it is certainly invertible, but every nonzero vector is an eigenvector with eigenvalue $1$ (the null vector $\mathbf{0}$ is never an eigenvector).

For (c), as you rightly remarked, for a matrix $Z$, $0$ is an eigenvalue of $Z$ iff $p_Z(0) = 0$ iff $\det(Z - 0E_n) = 0$ iff $\det(Z) = 0$ iff $Z$ is singular, i.e. not invertible.

For (d), the rank-nullity theorem seems to be the best way to go.

Now for (a), write $Z = (z_1, \dots, z_n)$ in column representation. If $z_1, \dots, z_n$ span $\mathbb{R}^n$, then for all $v \in \mathbb{R}^n$ there exist $x_1, \dots, x_n \in \mathbb{R}$ such that $\sum_{i=1}^n x_i z_i = v$. Since $Zx = \sum_{i=1}^n x_i z_i$ for $x = (x_1, \dots, x_n)$, this says that for every $v \in \mathbb{R}^n$ there exists an $x \in \mathbb{R}^n$ with $Zx = v$.

Thus the linear map $\phi_Z(x) = Zx$ on $\mathbb{R}^n$ is surjective. By the rank-nullity theorem, a linear map between two spaces of the same finite dimension (here $\mathbb{R}^n$) is surjective iff it is injective iff it is bijective; note that this holds only in finite dimensions. Thus $\phi_Z$ is bijective, and therefore $Z$ is invertible.

To round this off: in the finite-dimensional case of $\mathbb{R}^n$, $Z$ is invertible iff the columns of $Z$ span $\mathbb{R}^n$ iff the columns of $Z$ are linearly independent. The last of these corresponds, by a similar argument, to injectivity (check this yourself), and the same argument then gives bijectivity outright.






answered Aug 3 at 13:29 by zzuussee (accepted)
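The chain of equivalences in this answer can also be sanity-checked numerically. A minimal NumPy sketch, where the matrix `Z` is a hypothetical example rather than anything from the question:

```python
import numpy as np

n = 3
# Hypothetical invertible matrix (det = 1).
Z = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])

# The columns span R^n exactly when the column rank is n ...
spans = np.linalg.matrix_rank(Z) == n

# ... which is the same as phi_Z being surjective: Zx = v is solvable for any v.
v = np.array([1.0, 2.0, 3.0])
x = np.linalg.solve(Z, v)          # raises LinAlgError if Z is singular
assert np.allclose(Z @ x, v)

# ... and the same as Z being invertible.
invertible = not np.isclose(np.linalg.det(Z), 0.0)
assert spans and invertible
```

Of course this is only an illustration for one concrete matrix; the answer's argument is what establishes the equivalence in general.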









































A is true. $\mathbb{R}^n$ is an $n$-dimensional vector space, so if $n$ vectors span $\mathbb{R}^n$, then these $n$ vectors are a basis of $\mathbb{R}^n$. Hence the columns of the matrix $Z$ span $\mathbb{R}^n$ iff the columns are linearly independent, iff the system of equations $Zx = 0$ has only the trivial solution, iff $Z$ is invertible.






answered Aug 3 at 13:27 by Mark
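The chain "columns span iff independent iff $Zx = 0$ has only the trivial solution iff invertible" can be illustrated with a small NumPy sketch (hypothetical matrix, not from the question):

```python
import numpy as np

# Hypothetical 3x3 matrix with linearly independent columns.
Z = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

n = Z.shape[1]

# n spanning vectors form a basis, so spanning <=> rank n.
assert np.linalg.matrix_rank(Z) == n

# Equivalently, Zx = 0 has only the trivial solution: the null space
# has dimension n - rank(Z) = 0.
assert n - np.linalg.matrix_rank(Z) == 0

# Equivalently, Z is invertible.
Z_inv = np.linalg.inv(Z)  # would raise LinAlgError if Z were singular
assert np.allclose(Z @ Z_inv, np.eye(n))
```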









































We have:

  • That the columns of $Z$ span $\mathbb{R}^n$:

Yes indeed; in that case the matrix is full rank and invertible.

  • That $Z$ has no eigenvectors:

This point does not make sense.

  • That $Z$ has no eigenvectors of eigenvalue $0$:

If $0$ were an eigenvalue, $Z$ would not be invertible, since $\det(Z) = \prod_i \lambda_i = 0$.

  • Kernel of $Z$ $= \{0\}$:

Yes indeed; that implies the columns are independent ($Z$ is full rank).






answered Aug 3 at 13:31 by gimusi
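The determinant-eigenvalue relation used here, $\det(Z) = \prod_i \lambda_i$, can be checked numerically. A minimal sketch with a hypothetical symmetric matrix (chosen so its eigenvalues are real):

```python
import numpy as np

# Hypothetical symmetric matrix with eigenvalues 1 and 3.
Z = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues = np.linalg.eigvals(Z)

# det(Z) equals the product of the eigenvalues, so Z is invertible
# iff no eigenvalue is 0.
assert np.isclose(np.prod(eigenvalues), np.linalg.det(Z))
assert not np.any(np.isclose(eigenvalues, 0.0))
```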









































The statements follow the Invertible Matrix Theorem. The 1st, 3rd, and 4th all work, but the 2nd doesn't make much sense.

A is true because if the columns span $\mathbb{R}^n$, they are linearly independent, and hence $Z$ is invertible.






answered Aug 3 at 13:26 by RHowe




















