Taylor approximation of function in two variables

Let $f(x,y)$ be a real-valued function in two variables.



Let $x_n \to x_0$ and $y_n \to y_0$ as $n \to \infty$.



We assume $f(x,y)$ is differentiable in $x$ at $(x_0, y_0)$ with derivative $f'_x(x_0,y_0)$. Further suppose $f'_x(x,y)$ is continuous at $(x_0,y_0)$.




Question: Consider the following Taylor expansion about $(x_0,y_n)$:
$$f(x_n,y_n) = f(x_0,y_n) + f'_x(x_0,y_n)(x_n - x_0) + R_n(x_n,y_n)$$



Is it necessarily true that $R_n(x_n,y_n) = o(|x_n - x_0|)$?




Clearly if all instances of $y_n$ are replaced by $y_0$, the Taylor expansion is correct based on the definition of differentiability. And if $f(x,y)$ is differentiable in both $x$ and $y$ at $(x_0,y_0)$, we have



$$f(x_n,y_n) = f(x_0,y_0) + f'_x(x_0,y_0)(x_n - x_0) + f'_y(x_0,y_0)(y_n - y_0) + R_n^*(x_n,y_n)$$



with $R^*_n(x_n,y_n) = o(|x_n - x_0| + |y_n - y_0|)$.
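To make the jointly differentiable case concrete, here is a minimal numerical sketch; the function $f(x,y)=e^x\sin y$ and the sequences below are arbitrary illustrative choices, not taken from the question.

```python
import math

# Numerical sketch of the jointly differentiable case above. The function
# f(x, y) = exp(x) * sin(y) and the sequences x_n, y_n are illustrative choices.

def f(x, y):
    return math.exp(x) * math.sin(y)

def fx(x, y):          # exact partial derivative in x
    return math.exp(x) * math.sin(y)

def fy(x, y):          # exact partial derivative in y
    return math.exp(x) * math.cos(y)

x0, y0 = 0.2, 1.1
for n in [10, 100, 1000, 10000]:
    xn, yn = x0 + 1.0 / n, y0 - 2.0 / n
    R_star = f(xn, yn) - f(x0, y0) - fx(x0, y0) * (xn - x0) - fy(x0, y0) * (yn - y0)
    # the ratio below tends to 0, i.e. R*_n = o(|x_n - x_0| + |y_n - y_0|)
    print(n, R_star / (abs(xn - x0) + abs(yn - y0)))
```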



However, without differentiability in both variables, I wonder whether the Taylor expansion in the question is still correct.







asked Jul 15 at 4:22 by Guillaume F., edited Jul 15 at 21:01


2 Answers













Counterexample: Define $f$ to be $0$ on the set $\{(x,y): x \le y\} \cup \{(x,y): y \le 0\}$, and $f=1$ everywhere else. Let $(x_0,y_0)=(0,0)$. Then $f_x(0,y)=0$ for all $y$. But $f(1/n,1/(2n)) = 1$ for all $n$, which forces $R_n(1/n,1/(2n)) = 1$ for all $n$.
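For concreteness, here is a minimal numerical sketch of this counterexample; the finite-difference step and the printed values of $n$ are illustrative choices.

```python
# Numerical sketch of the counterexample: f = 0 on {x <= y} union {y <= 0},
# f = 1 elsewhere, expanded about (x_0, y_n) with (x_0, y_0) = (0, 0).

def f(x, y):
    return 0.0 if (x <= y or y <= 0) else 1.0

def fx_at_0(y, h=1e-8):
    # finite-difference estimate of f_x(0, y); it is 0 for every y
    return (f(h, y) - f(-h, y)) / (2 * h)

x0 = 0.0
for n in [10, 100, 1000, 10000]:
    xn, yn = 1.0 / n, 1.0 / (2 * n)
    Rn = f(xn, yn) - f(x0, yn) - fx_at_0(yn) * (xn - x0)
    # Rn stays equal to 1, so Rn / (xn - x0) = n grows without bound:
    # Rn is not o(|xn - x0|)
    print(n, Rn, Rn / (xn - x0))
```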






answered Jul 15 at 7:11 by zhw.


Accepted answer:
Yes, continuity of $f_x'(x,y)$ at $(x_0,y_0)$ is enough.



            Proof:



Because $f_x'(x,y)$ is continuous at $(x_0,y_0)$, there exists a closed ball $\mathcal{B}$ centered at $(x_0,y_0)$ on which $f'_x(x,y)$ exists. For $n$ large enough, $(x_n,y_n)$ lies in the ball, and we can thus write



            $$f(x_n,y_n) = f(x_0,y_n) + f'_x(x_0,y_n)(x_n - x_0) + R_n(x_n,y_n)$$



            with



$$\frac{R_n(x_n,y_n)}{x_n - x_0} = \frac{f(x_n,y_n) - f(x_0,y_n)}{x_n - x_0} - f'_x(x_0,y_n)$$



            We need to find a sufficient condition for




$$\frac{R_n(x_n,y_n)}{x_n - x_0} = o(1)$$




For $n$ large enough that $(x_n,y_n)$ lies in the ball $\mathcal{B}$, we have, from the Mean Value Theorem,




$$\frac{f(x_n,y_n) - f(x_0,y_n)}{x_n - x_0} = f'_x(\tilde{x}_n,y_n)$$




where $\tilde{x}_n$ is between $x_n$ and $x_0$.



            Hence, together with the triangle inequality,




$$\begin{align}
\left|\frac{R_n(x_n,y_n)}{x_n - x_0}\right|
&= \left| f'_x(\tilde{x}_n,y_n) - f'_x(x_0,y_n) \right| \\
&\le \left| f'_x(\tilde{x}_n,y_n) - f'_x(x_0,y_0) \right| + \left| f'_x(x_0,y_n) - f'_x(x_0,y_0) \right| \\
&= o(1)
\end{align}$$




where the last step follows from the continuity of the derivative at $(x_0,y_0)$, since both $(\tilde{x}_n,y_n)$ and $(x_0,y_n)$ converge to $(x_0,y_0)$.



            This proves the result.
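As a sanity check of the claim just proved, here is a minimal numerical sketch; the function $f(x,y)=\sin(xy)$ and the two sequences are arbitrary illustrative choices.

```python
import math

# Check that Rn / (xn - x0) -> 0 for a function whose partial derivative in x
# exists near (x0, y0) and is continuous there. f(x, y) = sin(x * y) and the
# sequences are illustrative choices only.

def f(x, y):
    return math.sin(x * y)

def fx(x, y):          # exact partial derivative in x
    return y * math.cos(x * y)

x0, y0 = 0.3, -0.7
for n in [10, 100, 1000, 10000]:
    xn = x0 + 1.0 / n
    yn = y0 + 1.0 / math.sqrt(n)       # y_n may converge more slowly than x_n
    Rn = f(xn, yn) - f(x0, yn) - fx(x0, yn) * (xn - x0)
    print(n, Rn / (xn - x0))           # tends to 0, i.e. Rn = o(|xn - x0|)
```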



Note 1: Essentially the same proof works when $x$ and $y$ are vectors. However, if $x$ is a vector, we need $f$ to be differentiable in $x$ in a neighborhood of $(x_0,y_0)$ to justify the use of the mean value theorem.



Note 2: Instead of the continuity of $f_x'(x,y)$ at $(x_0,y_0)$, we can assume that, at $x_0$ and for $y$ in a closed ball $\mathcal{B}$ centered at $y_0$, $f_x'(x,y)$ is continuous in $x$ uniformly in $y$, so that, for any $x_n \to x_0$,



$$\sup_{y \in \mathcal{B}}\left| f'_x(x_n,y) - f'_x(x_0,y) \right| = o(1)$$



In that case, for $n$ large enough that $y_n$ is in the ball,



$$\left| f'_x(\tilde{x}_n,y_n) - f'_x(x_0,y_n) \right| \le \sup_{y \in \mathcal{B}}\left| f'_x(\tilde{x}_n,y) - f'_x(x_0,y) \right| = o(1)$$
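Here is a small numerical sketch of this uniform-in-$y$ condition, again with the illustrative choice $f(x,y)=\sin(xy)$; the supremum over the ball is approximated on a grid.

```python
import math

# Approximate sup over y in a ball around y0 of |f'_x(xn, y) - f'_x(x0, y)|
# for the illustrative f(x, y) = sin(x * y); the sup is estimated on a grid.

def fx(x, y):
    return y * math.cos(x * y)

x0, y0, radius = 0.3, -0.7, 0.5
ys = [y0 + radius * (2.0 * k / 200 - 1.0) for k in range(201)]   # grid on [y0 - r, y0 + r]
for n in [10, 100, 1000, 10000]:
    xn = x0 + 1.0 / n
    sup_gap = max(abs(fx(xn, y) - fx(x0, y)) for y in ys)
    print(n, sup_gap)   # decreases toward 0 as xn -> x0
```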






answered Jul 15 at 4:37 by Guillaume F., edited Jul 17 at 20:18