Optimization and splitting the problem by dependent/independent variables

I have the following nonlinear function:



$$f(a,b,c,d)$$



and measurements:
$$f_{\text{measured}}^i$$



for $i = 1, 2, 3, \dots, N$.



The problem is defined as the minimization:
$$\min_{a,b,c,d}\;\Bigg(\sum_{i=1}^{N} \big(f_{\text{measured}}^i - f(a,b,c,d)\big)^2\Bigg) \tag{1}$$
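For concreteness, here is a minimal sketch of problem $(1)$ in Python/SciPy. Everything in it is illustrative: the functional form of $f$, the per-measurement input `x` (which the question leaves implicit), and the synthetic data are assumptions, not part of the question.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical model f(a, b, c, d); the dependence on a known per-measurement
# input x is assumed here only so that the fit is well posed.
def f(a, b, c, d, x):
    return a * np.exp(-b * x) + c * x + d


# Synthetic stand-ins for the measurements f_measured^i, i = 1..N.
x = np.linspace(0.0, 5.0, 50)
rng = np.random.default_rng(0)
f_meas = f(1.0, 0.7, 0.3, -0.2, x) + 0.01 * rng.normal(size=x.size)


# Problem (1): residuals as a function of all four variables a, b, c, d.
def residuals_full(p):
    a, b, c, d = p
    return f_meas - f(a, b, c, d, x)


res1 = least_squares(residuals_full, x0=[0.5, 0.5, 0.5, 0.0])
print("problem (1) estimate of (a, b, c, d):", res1.x)
```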



Further, I know that I can obtain specific measurements for which the following holds:



$$a = f(b,c,d)$$



and



$$b = f(c,d)$$



thereby splitting the variables into the independent $c, d$ and the dependent $a, b$. The optimization problem can then be reformulated as:
$$\min_{c,d}\;\Bigg(\sum_{i=1}^{N} \big(f_{\text{measured}}^i - f(a,b,c,d)\big)^2\Bigg) \tag{2}$$
with $a = f(b,c,d)$ and $b = f(c,d)$.
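Continuing the sketch above, problem $(2)$ differs only in that the solver searches over $c, d$ and recomputes $a, b$ from them at every residual evaluation. The dependence functions `b_of` and `a_of` below are hypothetical placeholders for $b = f(c,d)$ and $a = f(b,c,d)$:

```python
# Hypothetical closed-form dependences standing in for b = f(c,d) and a = f(b,c,d).
def b_of(c, d):
    return 0.5 * c + d


def a_of(b, c, d):
    return b + c * d


# Problem (2): residuals over the independent variables c, d only;
# a and b are recomputed from (c, d) at every evaluation.
def residuals_reduced(p):
    c, d = p
    b = b_of(c, d)
    a = a_of(b, c, d)
    return f_meas - f(a, b, c, d, x)


res2 = least_squares(residuals_reduced, x0=[0.5, 0.0])
print("problem (2) estimate of (c, d):", res2.x)
```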




My understanding:



By providing extra information (measurements) and distinguishing the independent variables $c, d$ from the dependent variables $a, b$, the error surface of $(1)$ is simplified: there is no need to search it along the $a, b$ dimensions.



Since the problem is nonlinear, I will use some gradient descent algorithm.
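As one concrete way to do that (continuing the sketch above), a gradient-based solver can be run directly on the scalar sum-of-squares objective of $(2)$; BFGS is used here only as a convenient stand-in for "some gradient descent algorithm", with the gradient approximated numerically by SciPy:

```python
from scipy.optimize import minimize

# Scalar objective of (2): the sum of squared residuals over (c, d).
def objective(p):
    r = residuals_reduced(p)
    return float(r @ r)


# BFGS estimates the gradient by finite differences when none is supplied.
res3 = minimize(objective, x0=[0.5, 0.0], method="BFGS")
print("gradient-based solve of (2):", res3.x, "objective:", res3.fun)
```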




QUESTION



  1. By providing more information about the problem (more measurements), so that the variables are split into the independent $c, d$ and the dependent $a, b$, is $(2)$ still a valid optimization problem? In other words, is it possible to limit the search dimensions of the error surface in this way?


  2. Does limiting the search dimensions (via the explicit relationship between independent and dependent variables) introduce local optima into the error surface (e.g. a saddle point becoming a valley)?



P.S.



The error surface is smooth and locally convex around the global minimum, and second-order derivatives are available (e.g. for a Hessian).







asked Jul 30 at 3:25 by Martin G
























