Definition of Rényi cross-entropy with a prior

[Note: This is a continuation of my earlier question Priors in Shannon and Rényi entropies. Please see the earlier question for an introduction to the notation.]



I wanted to extend the prior $\lambda(\theta)$ to the Rényi cross-entropy. There seem to be multiple expressions for the Rényi cross-entropy, e.g. here and here. As $\alpha\rightarrow1$, these expressions reduce to the Shannon cross-entropy $$H(p,q) = -\sum\limits_x p_\theta(x) \log q_{\theta'}(x).$$



I am interested in the Rényi cross-entropy expression that is obtained from Sundaresan's divergence, given by
$$I_\alpha(p_\theta, q_{\theta'}) = \frac{\alpha}{1-\alpha} \log\sum\limits_x p_\theta(x) (q_{\theta'}(x))^{\alpha-1} + \log \sum\limits_x (q_{\theta'}(x))^\alpha - \frac{1}{1-\alpha}\log\Bigg(\sum\limits_x p_\theta(x)^\alpha\Bigg).$$



The last term is the negative of the Rényi entropy. Since Sundaresan's divergence reduces to the Kullback-Leibler divergence as $\alpha\rightarrow1$, the first two terms of $I_\alpha(p_\theta, q_{\theta'})$ should be equivalent to the Shannon cross-entropy. Therefore, collecting the first two terms, the Rényi cross-entropy apparently is
$$H_\alpha(p,q) = \frac{\alpha}{1-\alpha} \log\sum\limits_x p_\theta(x) (q_{\theta'}(x))^{\alpha-1} + \log \sum\limits_x (q_{\theta'}(x))^\alpha \\ = \frac{\alpha}{1-\alpha}\log\sum\limits_x p_\theta(x) \Bigg(\frac{q_{\theta'}(x)}{\|q_{\theta'}\|}\Bigg)^{\alpha-1},$$
where $\|q_{\theta'}\| = \Big(\sum\limits_x (q_{\theta'}(x))^\alpha\Big)^{1/\alpha}$. By applying L'Hôpital's rule, it can be shown that $H_\alpha(p,q)$ approaches $H(p,q)$ as $\alpha\rightarrow1$.
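As a quick numerical sanity check on both reductions (the divergence to Kullback-Leibler and $H_\alpha$ to $H$), here is a minimal Python sketch on a toy pair of discrete distributions; `p` and `q` are arbitrary stand-ins for $p_\theta$ and $q_{\theta'}$, and the functions simply transcribe the formulas above.

```python
import numpy as np

# Toy stand-ins for p_theta and q_theta' (arbitrary example distributions).
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])

def kl_divergence(p, q):
    # D(p || q) = sum_x p(x) log(p(x)/q(x))
    return np.sum(p * np.log(p / q))

def shannon_cross_entropy(p, q):
    # H(p, q) = -sum_x p(x) log q(x)
    return -np.sum(p * np.log(q))

def sundaresan_divergence(p, q, alpha):
    # I_alpha(p, q) in the three-term form written above.
    return (alpha / (1.0 - alpha) * np.log(np.sum(p * q ** (alpha - 1.0)))
            + np.log(np.sum(q ** alpha))
            - 1.0 / (1.0 - alpha) * np.log(np.sum(p ** alpha)))

def renyi_cross_entropy(p, q, alpha):
    # H_alpha(p, q): the first two terms of I_alpha, collected using
    # ||q|| = (sum_x q(x)^alpha)^(1/alpha).
    q_norm = np.sum(q ** alpha) ** (1.0 / alpha)
    return alpha / (1.0 - alpha) * np.log(np.sum(p * (q / q_norm) ** (alpha - 1.0)))

print("D(p||q) =", kl_divergence(p, q), "  H(p,q) =", shannon_cross_entropy(p, q))
for alpha in (1.5, 1.1, 1.01, 1.001):
    print(alpha, sundaresan_divergence(p, q, alpha), renyi_cross_entropy(p, q, alpha))
```

For $\alpha$ close to 1, the printed $I_\alpha$ and $H_\alpha$ values should sit close to the Kullback-Leibler divergence and the Shannon cross-entropy, respectively.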



Accounting for the priors on the parameters $\theta$, the Shannon cross-entropy is obtained by multiplying the respective pdfs by the priors, i.e.,
$$H(p,q;\lambda) = -\sum\limits_x \lambda(\theta)p_\theta(x) \log \big(\lambda(\theta')q_{\theta'}(x)\big).$$



In the case of the Rényi cross-entropy, if I multiply the pdfs by the priors, the resulting expression does not reduce to $H(p,q;\lambda)$ as $\alpha\rightarrow1$. The solution suggested by heropup in response to my earlier question, i.e., multiplying by the factor $\lambda(\theta)$ before the log-sum, does not work here.
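To make the mismatch concrete, here is a small numerical sketch (again only an illustration; the scalar weights `lam_p` and `lam_q` are hypothetical stand-ins for $\lambda(\theta)$ and $\lambda(\theta')$ at two fixed parameter values). It compares $H(p,q;\lambda)$ with the value obtained by naively substituting $\lambda(\theta)p_\theta(x)$ and $\lambda(\theta')q_{\theta'}(x)$ into $H_\alpha$.

```python
import numpy as np

# Toy stand-ins for p_theta, q_theta' and hypothetical scalar prior weights.
p, q = np.array([0.2, 0.5, 0.3]), np.array([0.4, 0.4, 0.2])
lam_p, lam_q = 0.3, 0.7   # lambda(theta), lambda(theta')

def renyi_cross_entropy(p, q, alpha):
    # Same H_alpha(p, q) as above; the arguments need not sum to 1 here.
    q_norm = np.sum(q ** alpha) ** (1.0 / alpha)
    return alpha / (1.0 - alpha) * np.log(np.sum(p * (q / q_norm) ** (alpha - 1.0)))

# Target: prior-weighted Shannon cross-entropy H(p, q; lambda).
target = -np.sum(lam_p * p * np.log(lam_q * q))
print("H(p, q; lambda) =", target)

# Naive prior weighting inside the Rényi expression.
for alpha in (1.1, 1.01, 1.001):
    print(alpha, renyi_cross_entropy(lam_p * p, lam_q * q, alpha))
```

In this toy example the naively weighted values do not settle at $H(p,q;\lambda)$ as $\alpha\rightarrow1$ (the prior on $\theta$ enters through an $\frac{\alpha}{1-\alpha}\log\lambda(\theta)$ contribution that is unbounded near $\alpha=1$), which appears consistent with the non-convergence described above.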



What am I missing? Any help will be greatly appreciated.



-RD







asked Jul 28 at 12:27 by r2d2, edited Jul 28 at 14:09