Linearizing two variable function

I have another linearization question similar to the one here. This time, I have two variables in my equation and I am in search of an "$A+B\rho$" or possibly "$A+B\rho+C\theta$" approximation. Here is my equation:



$$W = \frac{\theta}{2(1-\rho)}$$



where $\theta,\rho\in \mathbb{R}^+$ and $\rho\in[0,1)$, i.e., $0\leq \rho <1$.



I tried to come up with "$A+B\rho$", although I feel like the correct form of the linearization should be "$A+B\rho+C\theta$". I followed Leibovici's linear regression method with Taylor series.



I minimized the norm:



$$F = \int_a^b \left(A + B \rho - \frac{\theta}{2(1-\rho)}\right)^2 \, d\rho$$



After integrating and taking the partial derivatives with respect to $A$ and $B$, I came up with the following two equations (which I then set to zero):



$\frac{\partial F}{\partial A} = - 2 A a + 2 A b - B a^2 + B b^2 - \theta \log\left(a - 1\right) + \theta \log\left(b - 1\right)$
$\frac{\partial F}{\partial B} = - A a^2 + A b^2 - \frac{2 B a^3}{3} + \frac{2 B b^3}{3} - a \theta + b \theta - \theta \log\left(a - 1\right) + \theta \log\left(b - 1\right)$



Setting $a=0.0$ and $b=0.1$, I came up with the following approximation (which is still nonlinear, because of the $\theta\rho$ product):



$W\approx 0.499055\,\theta + 0.554939\,\theta\rho$
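
For reference, here is a short SymPy sketch of the computation above (just an illustration of the steps, not production code; variable names are arbitrary):

```python
import sympy as sp

A, B, rho, theta = sp.symbols('A B rho theta', real=True)

W = theta / (2 * (1 - rho))          # the function to approximate
a, b = 0, sp.Rational(1, 10)         # the bounds used above

# F = integral over [a, b] of the squared error of the A + B*rho fit
F = sp.integrate((A + B * rho - W) ** 2, (rho, a, b))

# Stationarity: dF/dA = 0 and dF/dB = 0 give a linear system in A and B
sol = sp.solve([sp.diff(F, A), sp.diff(F, B)], [A, B])

print(sp.N(sol[A], 6))   # close to 0.499055*theta
print(sp.N(sol[B], 6))   # close to 0.554939*theta
```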



I do not know if this makes life easier or not, but we have the following relationship between $\rho$ and $\theta$:



$$\rho = \sum_{j\in J} \frac{\lambda_j}{\mu_j}$$
and
$$\theta = \sum_{j\in J} \frac{\lambda_j}{\mu_j^2}$$
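
To make the notation concrete, with some hypothetical $\lambda_j$, $\mu_j$ values these two sums are simply:

```python
# Hypothetical lambda_j and mu_j values, only to illustrate the two sums above
lam = [0.2, 0.3, 0.1]
mu  = [2.0, 3.0, 4.0]

rho   = sum(l / m    for l, m in zip(lam, mu))   # sum_j lambda_j / mu_j
theta = sum(l / m**2 for l, m in zip(lam, mu))   # sum_j lambda_j / mu_j^2
print(rho, theta)                                # 0.225  0.0895833...
```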



Additionally, I am not really looking for a Newton/Newton-Raphson linearization, as I believe a line obtained from a single-point approximation does not satisfactorily represent the curve in this case. Considering $\theta\in \mathbb{R}^+$, I do not think Newton-derived methods would help me.



Any recommendation is appreciated.







asked Jul 25 at 18:01 by user8028576




















          1 Answer



























Since now you face a surface, we can consider that we need to minimize $$G = \int_a^b \int_c^d \left(A + B \rho + C \theta - \frac{\theta}{2(1-\rho)}\right)^2\,d\rho\,d\theta$$ with respect to $A,B,C$.
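
The setup can be sketched in SymPy, for instance, as follows (this is only an illustration of the minimization, not the CAS session behind the formulas below; general bounds are kept symbolic):

```python
import sympy as sp

A, B, C, rho, theta, a, b, c, d = sp.symbols('A B C rho theta a b c d', real=True)

W = theta / (2 * (1 - rho))
# Double integral of the squared error of the A + B*rho + C*theta fit
G = sp.integrate((A + B*rho + C*theta - W)**2, (rho, a, b), (theta, c, d))

# The three stationarity conditions form a linear system in A, B, C
sol = sp.solve([sp.diff(G, v) for v in (A, B, C)], [A, B, C])
# After simplification, sol[A], sol[B], sol[C] should match the closed forms below
```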



I shall not reproduce here the analytical expressions of $G$ or of the partial derivatives $\frac{\partial G}{\partial A}$, $\frac{\partial G}{\partial B}$, $\frac{\partial G}{\partial C}$ (they are really messy), but the solutions are quite simple (I did not finish the simplifications).



$$4(a-b)^3 A=-3 (a+b) (c+d) \big((a+b-2) \log (a-1)-(a+b-2) \log (b-1)-2 a+2 b\big)$$
$$2(a-b)^3 B=3 (c+d) \big((a+b-2) \log (a-1)-(a+b-2) \log (b-1)-2 a+2 b\big)$$
$$2(a-b)\, C=\log (b-1)-\log (a-1)$$



Using $a=0$, $b=\frac 1{10}$, $c=\frac 9{10}$, $d=\frac {11}{10}$, this would lead to
$$A=30-285 \log \left(\frac{10}{9}\right)\qquad B=300 \left(19 \log \left(\frac{10}{9}\right)-2\right)\qquad C=5 \log \left(\frac{10}{9}\right)$$ that is to say
$$A\approx -0.027747 \qquad B \approx 0.554939 \qquad C \approx 0.526803$$
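
A quick numerical check of these closed forms (plain floating point is enough here):

```python
import math

L = math.log(10 / 9)

A = 30 - 285 * L          # -0.0277469...
B = 300 * (19 * L - 2)    #  0.5549394...
C = 5 * L                 #  0.5268025...
print(A, B, C)
```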



In order to check the validity of the results, I generated a data set of $\frac{\theta}{2(1-\rho)}$ with steps $\Delta \rho=\Delta \theta=\frac{1}{100}$ between the selected bounds.



A linear regression gave the following results
$$\begin{array}{cccc}
 & \text{Estimate} & \text{Standard Error} & \text{Confidence Interval} \\
A & -0.027756 & 0.00130 & \{-0.030326,\,-0.025186\} \\
B & +0.555113 & 0.00248 & \{+0.550223,\,+0.560002\} \\
C & +0.526900 & 0.00130 & \{+0.524346,\,+0.529454\} \\
\end{array}$$ which seems to confirm the analytical results.
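
The verification step can be sketched with NumPy as follows (again only a sketch of the procedure described above, not the regression run that produced the table; variable names are mine):

```python
import numpy as np

# Grid over [a, b] x [c, d] with steps 1/100, as described above
rho   = np.arange(0.0, 0.1 + 1e-9, 0.01)
theta = np.arange(0.9, 1.1 + 1e-9, 0.01)
R, T  = np.meshgrid(rho, theta)

W = T / (2.0 * (1.0 - R))                 # exact values of theta / (2(1-rho))

# Ordinary least squares fit of A + B*rho + C*theta to W
X = np.column_stack([np.ones(R.size), R.ravel(), T.ravel()])
coef, *_ = np.linalg.lstsq(X, W.ravel(), rcond=None)
print(coef)                               # should be close to (-0.0278, 0.555, 0.527)
```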






answered Jul 26 at 8:03 by Claude Leibovici


• Dr. Leibovici, this was exactly what I had in mind. I just couldn't figure out that I needed to integrate twice, over $[a,b]$ and $[c,d]$. I really appreciate your very well-written answer. Now I know exactly what to do for linearization in two or more variables.
  – user8028576
  Jul 26 at 13:24









