Entropy of a unimodal continuous probability distribution
Among unimodal continuous probability distributions supported on the positive reals and whose mean and mode coincide, which one has the maximal entropy?
probability-distributions
+1, nice question. Do you mean the differential entropy? (As there are different ways to define entropy for continuous distributions; see here.) – joriki, Jul 21 at 13:13
asked Jul 21 at 12:44
Sylvain Julien
1 Answer
I'll assume that you're referring to the differential entropy of the distribution.
There is no such extremal unimodal distribution.
There are three constraints on the distribution function $f(x)$. Two can be expressed in integral form:
$$\int_0^\infty f(x)\,\mathrm dx=1$$
and
$$\int_0^\infty xf(x)\,\mathrm dx=\mu\;.$$
Ignoring for now the constraint that $f$ is unimodal with mode $x_0$, we get the following Lagrangian:
$$
L[f]=\int_0^\infty\left(f(x)\log f(x)+\alpha f(x)+\beta xf(x)\right)\mathrm dx\;.
$$
Varying with respect to $f$ yields
$$
\log f(x)+1+\alpha+\beta x=0\;.
$$
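(Filling in this step explicitly, which is left implicit in the answer: exponentiating gives $f(x)=e^{-1-\alpha-\beta x}$, and the two integral constraints then fix the multipliers, recovering the standard maximum-entropy density on $[0,\infty)$ with given mean,
$$f(x)=\frac1\mu e^{-x/\mu}\;,\qquad h(f)=-\int_0^\infty f(x)\log f(x)\,\mathrm dx=1+\log\mu\;.)$$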
Thus, $f$ for this simplified problem would be an exponential function. I'm not sure how to prove this formally, but it seems clear to me that if we add the constraint that $f$ is unimodal with mode $x_0$, the result must be two exponentials decaying away from $x_0$ on either side. But then there's no extremal value of the parameters, because for any given solution you can make the decay slightly slower on both sides while maintaining the mode and the mean, thus slightly increasing the entropy. The limit of this construction is a distribution that's no longer unimodal, but is constant up to the “mode” and then decays exponentially.
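The limiting distribution can be checked numerically. The sketch below is my own addition, not part of the answer: it takes the density to be a constant $c$ on $[0,x_0]$ followed by $c\,e^{-\lambda(x-x_0)}$ for $x>x_0$, and the closed forms $\lambda=\sqrt2/x_0$ and $c=\lambda/(\lambda x_0+1)$ come from imposing normalization and mean $=x_0$ by hand.

```python
# Numerical sanity check of the limiting density described above (my own
# addition): f equals c on [0, x0] and decays as c*exp(-lam*(x - x0)) for
# x > x0. Matching normalization and mean = x0 gives lam = sqrt(2)/x0 and
# c = lam/(lam*x0 + 1) in closed form.
import numpy as np

x0 = 1.0
lam = np.sqrt(2.0) / x0        # decay rate forced by mean = x0
c = lam / (lam * x0 + 1.0)     # normalizing constant

dx = 1e-4
x = np.arange(0.0, 40.0, dx)   # the tail beyond 40 carries negligible mass
f = np.where(x <= x0, c, c * np.exp(-lam * (x - x0)))

def integrate(y):
    """Trapezoidal rule on the uniform grid."""
    return float((y[1:] + y[:-1]).sum() * 0.5 * dx)

total = integrate(f)                 # should be close to 1
mean = integrate(x * f)              # should be close to x0
entropy = integrate(-f * np.log(f))  # differential entropy in nats

print(f"normalization = {total:.6f}, mean = {mean:.6f}, entropy = {entropy:.4f}")
print(f"analytic entropy = {-np.log(c) + c / lam:.4f}")
```

For $x_0=1$ this gives an entropy of about $0.95$ nats, strictly below the $1$ nat of the unconstrained maximizer (the exponential with the same mean), which is consistent with the mean-equals-mode constraint costing entropy.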
edited Jul 21 at 23:48
answered Jul 21 at 15:35
joriki