An orthogonal map associated with inner product

Let $r\leq n$ be a natural number and let $\{v_1,v_2,\dots,v_r\}$ and $\{w_1,w_2,\dots,w_r\}$ be two linearly independent subsets of $\mathbb{R}^n$ such that $\langle v_i,v_j \rangle = \langle w_i,w_j \rangle \;\forall\; 1\leq i,j \leq r$, where $\langle\,,\,\rangle$ denotes the standard inner product on $\mathbb{R}^n$. Prove that there exists an orthogonal operator $T$ on $\mathbb{R}^n$ such that $T(v_i)=w_i$ for all $1\leq i \leq r$.



An orthogonal mapping preserves inner products, and it is also known that $T$ is orthogonal on $\mathbb{R}^n$ if $\langle T(\alpha),T(\beta)\rangle = \langle \alpha,\beta \rangle$ for all $\alpha,\beta \in \mathbb{R}^n$. But the question here demands a proof of the existence of such an orthogonal map. So do I need to find an example of such a map, or is there another way to prove this in general? Any help will be appreciated.







asked Jul 27 at 16:28 by am_11235...

          2 Answers

You define $Tv_j=w_j$ for all $j=1,\ldots,r$, and since both sets are linearly independent, $T$ gets defined by linearity on their span.



But now you have the problem that maybe $r<n$. Expanding both sets to bases does not work right away, because you have no way to preserve the inner product relations. So what you do is take $v_{r+1},\ldots,v_n$ to be an orthonormal basis of $\{v_1,\ldots,v_r\}^\perp$, and $w_{r+1},\ldots,w_n$ an orthonormal basis of $\{w_1,\ldots,w_r\}^\perp$. Then define $Tv_j=w_j$ for all $j=1,\ldots,n$. The inner product relations are preserved because $\langle v_j,v_k\rangle=\delta_{kj}=\langle w_j,w_k\rangle$ for $j,k>r$, and when one index is at most $r$ and the other is greater than $r$, the inner product is zero by orthogonality.
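For concreteness, here is a minimal numerical sketch of this construction with NumPy (my own illustration; the function name orthogonal_extension and the random test data are not part of the answer). The columns of V and W stand for the $v_i$ and $w_i$, and the two orthogonal complements are completed with orthonormal bases obtained from an SVD.

    import numpy as np

    def orthogonal_extension(V, W):
        # V, W: n x r matrices with linearly independent columns and V.T @ V == W.T @ W.
        # Returns an orthogonal n x n matrix T with T @ V == W, following the recipe above.
        n, r = V.shape
        Vp = np.linalg.svd(V)[0][:, r:]   # orthonormal basis of span(v_1,...,v_r)^perp
        Wp = np.linalg.svd(W)[0][:, r:]   # orthonormal basis of span(w_1,...,w_r)^perp
        A = np.hstack([V, Vp])            # basis v_1,...,v_n of R^n
        B = np.hstack([W, Wp])            # basis w_1,...,w_n of R^n
        return B @ np.linalg.inv(A)       # the unique linear map with T v_j = w_j for all j

    # Quick check on data satisfying the hypothesis (W = Q V for an orthogonal Q).
    rng = np.random.default_rng(0)
    n, r = 5, 3
    V = rng.standard_normal((n, r))
    Q = np.linalg.qr(rng.standard_normal((n, n)))[0]
    W = Q @ V
    T = orthogonal_extension(V, W)
    print(np.allclose(T @ V, W), np.allclose(T.T @ T, np.eye(n)))   # True True

The key point the check confirms is that the two extended bases have the same Gram matrix, so the change-of-basis map is orthogonal.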






answered Jul 27 at 16:35 by Martin Argerami

Let the subspace

$V = \operatorname{span}(v_1, v_2, \ldots, v_r) = \operatorname{span}(w_1, w_2, \ldots, w_r); \tag{1}$

(here we treat the case in which the two spans coincide; in general one maps $V^\perp$ isometrically onto $\operatorname{span}(w_1, \ldots, w_r)^\perp$ instead, as in the other answer); we define $T$ on $V$ by setting

$Tv_k = w_k, \; 1 \le k \le r, \tag{2}$

and extending $T$ by linearity to all of $V$, viz:

$T\left( \displaystyle\sum_1^r \alpha_i v_i \right) = \displaystyle\sum_1^r \alpha_i Tv_i = \sum_1^r \alpha_i w_i; \tag{3}$

that such an extension is possible follows from the linear independence of the $v_i$ and the $w_i$, as is well known.



If

$\alpha, \beta \in V \tag{4}$

we may write

$\alpha = \displaystyle\sum_1^r \alpha_i v_i, \; \beta = \sum_1^r \beta_j v_j, \tag{5}$

whence

$\left\langle T\alpha, T\beta \right\rangle = \left\langle T\left(\displaystyle\sum_1^r \alpha_i v_i\right), T\left(\displaystyle\sum_1^r \beta_j v_j\right) \right\rangle = \left\langle \displaystyle\sum_1^r \alpha_i Tv_i, \displaystyle\sum_1^r \beta_j Tv_j \right\rangle = \displaystyle\sum_{i,j=1}^r \alpha_i \beta_j \langle Tv_i, Tv_j \rangle = \sum_{i,j=1}^r \alpha_i \beta_j \langle w_i, w_j \rangle = \sum_{i,j=1}^r \alpha_i \beta_j \langle v_i, v_j \rangle$
$= \left\langle \displaystyle\sum_1^r \alpha_i v_i, \displaystyle\sum_1^r \beta_j v_j \right\rangle = \langle \alpha, \beta \rangle, \tag{6}$

which proves that $T:V \to V$ is orthogonal.
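As a quick numerical illustration of why (6) works (my own check, not part of the answer): the inner product of two elements of $V$ written in the basis $v_1,\ldots,v_r$ depends only on the Gram matrix $\langle v_i, v_j\rangle$, so equal Gram matrices give equal inner products after replacing each $v_i$ by $w_i$.

    import numpy as np

    rng = np.random.default_rng(1)
    n, r = 6, 3
    V = rng.standard_normal((n, r))           # columns play the role of v_1,...,v_r
    G = V.T @ V                               # Gram matrix: G[i, j] = <v_i, v_j>
    a, b = rng.standard_normal(r), rng.standard_normal(r)
    lhs = (V @ a) @ (V @ b)                   # <sum_i a_i v_i, sum_j b_j v_j>
    print(np.allclose(lhs, a @ G @ b))        # True: the value depends only on G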



In the event that $r < n$, we extend $T$ from $V$ to all of $\Bbb R^n$ as follows: let $W \subset \Bbb R^n$ be the orthogonal complement of $V$, $W = V^\bot$; then we have

$\Bbb R^n = V \oplus W; \tag{7}$

we observe that any $x \in \Bbb R^n$ may be represented uniquely as

$x = v + w, \; v \in V, \, w \in W; \tag{8}$

existence of such a decomposition follows directly from (7), i.e. from the fact that $W = V^\bot$; to see uniqueness, suppose

$x = v_1 + w_1 = v_2 + w_2, \; v_1, v_2 \in V, \, w_1, w_2 \in W; \tag{9}$



then

$v_1 - v_2 = w_2 - w_1, \tag{10}$

which forces

$v_1 - v_2 = 0 = w_2 - w_1, \tag{11}$

since

$v_1 - v_2 \in V, \; w_1 - w_2 \in W = V^\bot, \tag{12}$

and hence

$\Vert v_1 - v_2 \Vert^2 = \langle v_1 - v_2, v_1 - v_2 \rangle = \langle v_1 - v_2, w_2 - w_1 \rangle = 0, \tag{13}$



implying $v_1 = v_2$, and thus $w_1 = w_2$ via (11); so the decomposition (8) is unique; this uniqueness means we may unambiguously define an extension $T_E$ of $T$ from $V$ to $V \oplus W = \Bbb R^n$ by

$T_E(v + w) = Tv + w, \; v \in V, \, w \in W; \tag{14}$

that is,

$T_E = T \oplus I: V \oplus W \to V \oplus W; \tag{15}$

we can formally show $T_E$ is orthogonal on all of $\Bbb R^n$ if we observe that, for $v_1, v_2 \in V$, $w_1, w_2 \in W$,



$\langle v_1 + w_1, v_2 + w_2 \rangle$
$= \langle v_1, v_2 \rangle + \langle v_1, w_2 \rangle + \langle w_1, v_2 \rangle + \langle w_1, w_2 \rangle = \langle v_1, v_2 \rangle + \langle w_1, w_2 \rangle; \tag{16}$

thus,

$\langle T_E(v_1 + w_1), T_E(v_2 + w_2) \rangle = \langle (T \oplus I)(v_1 + w_1), (T \oplus I)(v_2 + w_2) \rangle$
$= \langle Tv_1 + w_1, Tv_2 + w_2 \rangle = \langle Tv_1, Tv_2 \rangle + \langle w_1, w_2 \rangle$
$= \langle v_1, v_2 \rangle + \langle w_1, w_2 \rangle = \langle v_1 + w_1, v_2 + w_2 \rangle, \tag{17}$

which shows that $T_E = T \oplus I$ is orthogonal on $\Bbb R^n$.
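Concretely, (14)-(15) say that $T_E = T\circ P + (I - P)$, where $P$ is the orthogonal projection onto $V$. A minimal NumPy sketch of this extension (the function and variable names are mine, chosen only for illustration, and assume an orthonormal basis of $V$ is available):

    import numpy as np

    def extend_by_identity(B_V, T_on_V):
        # B_V: n x k matrix whose columns are an orthonormal basis of V.
        # T_on_V: k x k orthogonal matrix of T in that basis.
        # Returns the n x n matrix of T_E, i.e. T_E(v + w) = Tv + w as in (14).
        n = B_V.shape[0]
        P = B_V @ B_V.T                                  # orthogonal projection onto V
        return B_V @ T_on_V @ B_V.T + (np.eye(n) - P)    # T on V, identity on W = V^perp

    # Sanity check: T_E is orthogonal whenever T_on_V is.
    rng = np.random.default_rng(2)
    B_V = np.linalg.qr(rng.standard_normal((5, 2)))[0]      # orthonormal basis of a plane V in R^5
    T_on_V = np.linalg.qr(rng.standard_normal((2, 2)))[0]   # an orthogonal map of V, in that basis
    T_E = extend_by_identity(B_V, T_on_V)
    print(np.allclose(T_E.T @ T_E, np.eye(5)))              # True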






answered Jul 28 at 1:54 by Robert Lewis