Spanning Vectors - proof [duplicate]

























This question already has an answer here:



  • Basis for a subspace Proof

    3 answers



Prove that if $v$ and $w$ span $V$, then $v+w$ and $v-w$ also span $V$.



My thinking is that the two vectors are linear combinations of $v$ and $w$, which is why they span $V$, but I am unsure how to prove this properly.


















marked as duplicate by Arthur, Michael Hoppe, John Ma, Lord Shark the Unknown, Simply Beautiful Art Aug 1 at 14:52


This question has been asked before and already has an answer. If those answers do not fully address your question, please ask a new question.














  • math.stackexchange.com/questions/2866751/…
    – Fred
    Jul 30 at 9:22










    In general, linear combinations of spanning vectors won't span the vector space. For example, $e_1=\begin{pmatrix}1\\0\end{pmatrix}$ and $e_2=\begin{pmatrix}0\\1\end{pmatrix}$ span $\mathbb{R}^2$, but $e_1+e_2$ and $2e_1+2e_2$ only span a $1$-dimensional subspace.
    – Babelfish
    Jul 30 at 9:32
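The counterexample in the comment above can be checked numerically. The following is a small illustrative sketch (the helper `spans_r2` is not from the original post): two vectors span $\mathbb{R}^2$ exactly when their $2\times 2$ determinant is nonzero.

```python
# Sketch: two vectors u, v span R^2 iff det([u v]) != 0.
def spans_r2(u, v):
    # 2x2 determinant: u[0]*v[1] - u[1]*v[0]
    return u[0] * v[1] - u[1] * v[0] != 0

e1, e2 = (1, 0), (0, 1)
s = (e1[0] + e2[0], e1[1] + e2[1])                   # e1 + e2 = (1, 1)
t = (2 * e1[0] + 2 * e2[0], 2 * e1[1] + 2 * e2[1])   # 2e1 + 2e2 = (2, 2)

print(spans_r2(e1, e2))  # True: the standard basis spans R^2
print(spans_r2(s, t))    # False: (1,1) and (2,2) are parallel
```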














asked Jul 30 at 9:21









J-Dorman

555








2 Answers






























Consider any vector in the span:

$$u=av+bw$$

We look for $c$ and $d$ such that

$$u=c(v+w)+d(v-w)=(c+d)v+(c-d)w$$

so we need

  • $c+d=a$

  • $c-d=b$

that is,

  • $c=\frac{a+b}{2}$

  • $d=\frac{a-b}{2}$

Therefore

$$u=av+bw=\frac{a+b}{2}(v+w)+\frac{a-b}{2}(v-w)$$

and hence $v+w$ and $v-w$ also span $V$.
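The coefficient choice above can be sanity-checked numerically. This is an illustrative sketch (the helper `combo` and the sample values are not from the original answer): for arbitrary $a,b$ and sample vectors, $av+bw$ and $\frac{a+b}{2}(v+w)+\frac{a-b}{2}(v-w)$ agree.

```python
# Sketch: verify u = a*v + b*w equals c*(v+w) + d*(v-w)
# with c = (a+b)/2 and d = (a-b)/2, for sample vectors in R^2.
def combo(a, b, v, w):
    # Componentwise linear combination a*v + b*w.
    return tuple(a * vi + b * wi for vi, wi in zip(v, w))

a, b = 3.0, -1.5
v, w = (2.0, 5.0), (-1.0, 4.0)
c, d = (a + b) / 2, (a - b) / 2

u_direct = combo(a, b, v, w)                # a*v + b*w
vw_sum = combo(1, 1, v, w)                  # v + w
vw_diff = combo(1, -1, v, w)                # v - w
u_rewritten = combo(c, d, vw_sum, vw_diff)  # c*(v+w) + d*(v-w)

print(u_direct == u_rewritten)  # True
```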















































    Note that $\langle v+w,v-w\rangle\subseteq\langle v,w\rangle$, because any linear combination of $v+w$ and $v-w$ is obviously a linear combination of $v$ and $w$.

    On the other hand,
    $$
    v=\frac{1}{2}(v+w)+\frac{1}{2}(v-w)
    \qquad
    w=\frac{1}{2}(v+w)-\frac{1}{2}(v-w)
    $$
    so, for the same reason as before, $\langle v,w\rangle\subseteq\langle v+w,v-w\rangle$. The two spans are therefore equal.
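The two identities recovering $v$ and $w$ from $v+w$ and $v-w$ can also be checked on sample vectors. This is an illustrative sketch (the helpers `add` and `scale` are not from the original answer):

```python
# Sketch: recover v and w from p = v+w and m = v-w via
# v = (1/2)p + (1/2)m  and  w = (1/2)p - (1/2)m.
def add(x, y):
    return tuple(a + b for a, b in zip(x, y))

def scale(c, x):
    return tuple(c * a for a in x)

v, w = (2.0, -3.0), (4.0, 1.0)
p = add(v, w)                 # v + w
m = add(v, scale(-1, w))      # v - w

v_back = add(scale(0.5, p), scale(0.5, m))
w_back = add(scale(0.5, p), scale(-0.5, m))
print(v_back == v, w_back == w)  # True True
```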


































          answered Jul 30 at 10:08









          gimusi





















                  answered Jul 30 at 11:37









                  egreg












