Definition of Rényi cross entropy with a prior
[Note: This is a continuation of my earlier question Priors in Shannon and Rényi entropies. Please see that question for an introduction to the notation.]
I wanted to extend the prior $\lambda(\theta)$ to the Rényi cross-entropy. There seem to be multiple expressions for the Rényi cross-entropy, e.g. here and here. As $\alpha\rightarrow1$, these expressions reduce to the Shannon cross-entropy $$H(p,q) = -\sum\limits_x p_\theta(x) \log q_{\theta'}(x).$$
I am interested in the Rényi cross-entropy expression that is obtained from Sundaresan's divergence, given by
$$I_\alpha(p_\theta, q_{\theta'}) = \frac{\alpha}{1-\alpha} \log\sum\limits_x p_\theta(x)\,(q_{\theta'}(x))^{\alpha-1} + \log \sum\limits_x(q_{\theta'}(x))^\alpha - \frac{1}{1-\alpha}\log\Bigg(\sum\limits_x p_\theta(x)^\alpha\Bigg).$$
The last term is the negative of the Rényi entropy. Since Sundaresan's divergence reduces to the Kullback-Leibler divergence as $\alpha\rightarrow1$, the first two terms of $I_\alpha(p_\theta, q_{\theta'})$ should be equivalent to the Shannon cross-entropy. Therefore, collecting the first two terms, the Rényi cross-entropy is apparently
$$H_\alpha(p,q) = \frac{\alpha}{1-\alpha} \log\sum\limits_x p_\theta(x)\,(q_{\theta'}(x))^{\alpha-1} + \log \sum\limits_x(q_{\theta'}(x))^\alpha \\ = \frac{\alpha}{1-\alpha}\log\sum\limits_x p_\theta(x) \Bigg(\frac{q_{\theta'}(x)}{\|q_{\theta'}\|}\Bigg)^{\alpha-1},$$
where $\|q_{\theta'}\| = \Big(\sum\limits_x(q_{\theta'}(x))^\alpha\Big)^{1/\alpha}$. Through an application of L'Hôpital's rule, it can be shown that $H_\alpha(p,q)$ approaches $H(p,q)$ as $\alpha\rightarrow1$.
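In case it is useful, here is my own sketch of that limit, using the first form of $H_\alpha$ above and assuming $p_\theta$ and $q_{\theta'}$ are pmfs (so each sums to $1$). The second term vanishes, since $\log\sum\limits_x (q_{\theta'}(x))^\alpha \rightarrow \log\sum\limits_x q_{\theta'}(x) = 0$, and the first term, written as $\frac{\alpha\log\sum_x p_\theta(x)(q_{\theta'}(x))^{\alpha-1}}{1-\alpha}$, is a $\frac{0}{0}$ form to which L'Hôpital's rule applies:
$$\lim_{\alpha\rightarrow1}\frac{\alpha\log\sum_x p_\theta(x)(q_{\theta'}(x))^{\alpha-1}}{1-\alpha} = \lim_{\alpha\rightarrow1}\frac{\log\sum_x p_\theta(x)(q_{\theta'}(x))^{\alpha-1} + \alpha\,\dfrac{\sum_x p_\theta(x)(q_{\theta'}(x))^{\alpha-1}\log q_{\theta'}(x)}{\sum_x p_\theta(x)(q_{\theta'}(x))^{\alpha-1}}}{-1} = -\sum_x p_\theta(x)\log q_{\theta'}(x) = H(p,q).$$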
Accounting for the priors on the parameters $\theta$, the Shannon cross-entropy is obtained by multiplying the respective pdfs by the priors, i.e.,
$$H(p,q;\lambda) = -\sum\limits_x \lambda(\theta)\,p_\theta(x) \log \big(\lambda(\theta')\,q_{\theta'}(x)\big).$$
In the case of the Rényi cross-entropy, if I multiply the pdfs by the priors, the resulting expression does not reduce to $H(p,q;\lambda)$ as $\alpha\rightarrow1$. The solution suggested by heropup in response to my earlier question, i.e., multiplying by the factor $\lambda(\theta)$ before the log-sum, does not work here.
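To make the mismatch concrete, here is a small numerical sketch (entirely my own; the toy pmfs and the prior values $\lambda(\theta)=0.3$, $\lambda(\theta')=0.7$ are arbitrary illustrations, not part of the problem) that evaluates the $H_\alpha$ formula above near $\alpha=1$, with and without the naive prior weighting:

    import numpy as np

    def renyi_cross_entropy(p, q, alpha):
        # H_alpha(p, q) = alpha/(1-alpha) * log sum_x p(x) (q(x)/||q||)^(alpha-1),
        # with ||q|| = (sum_x q(x)^alpha)^(1/alpha), as defined above.
        q_norm = np.sum(q ** alpha) ** (1.0 / alpha)
        return alpha / (1.0 - alpha) * np.log(np.sum(p * (q / q_norm) ** (alpha - 1.0)))

    # Toy pmfs p_theta and q_theta' on a 4-point alphabet (arbitrary).
    p = np.array([0.1, 0.2, 0.3, 0.4])
    q = np.array([0.25, 0.25, 0.3, 0.2])

    # Without priors: the alpha -> 1 limit matches the Shannon cross-entropy.
    print(-np.sum(p * np.log(q)), renyi_cross_entropy(p, q, 1.001))

    # Naive prior weighting: replace the pmfs by lambda(theta) p and lambda(theta') q.
    lam_p, lam_q = 0.3, 0.7
    print(-np.sum(lam_p * p * np.log(lam_q * q)),          # H(p, q; lambda)
          renyi_cross_entropy(lam_p * p, lam_q * q, 1.001))
    # The second pair does not agree: the weighted expression diverges as
    # alpha -> 1, because lambda(theta) p_theta no longer sums to 1.

At $\alpha=1.001$ the first pair of numbers agree closely, while the prior-weighted pair do not, and the discrepancy grows as $\alpha\rightarrow1$; this is the failure described above.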
What am I missing? Any help will be greatly appreciated.
-RD
probability-distributions bayesian entropy
edited Jul 28 at 14:09
asked Jul 28 at 12:27
r2d2
235