Entropy conditioned on a function of a random variable
I am having trouble proving $H(Y \mid X) \leq H(Y \mid f(X))$, where $f$ is a function of $X$. The textbooks already prove $I(Y;X) \geq I(Y;f(X))$ and $H(f(X)) \leq H(X)$, but I cannot relate those results to this problem.
Besides, there is one other question that concerns me.
If $H(Y \mid X) = H(Y \mid f(X))$, what are the conditions for equality? Is it that $p_Y(\cdot \mid X) = p_Y(\cdot \mid f(X))$?
We already know that the implication $H(Z) = H(W) \Rightarrow p_Z(\cdot) = p_W(\cdot)$ is false. However, I suspect that $H(Y \mid X) = H(Y \mid f(X)) \Rightarrow p_Y(\cdot \mid X) = p_Y(\cdot \mid f(X))$ may be true.
Can anyone help? Thanks.
probability statistics entropy
asked Jul 29 at 3:17, edited Jul 29 at 5:41 – khahuras
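For a quick numerical sanity check of the claimed inequality, here is a minimal Python sketch (editor's note: the joint pmf and the merging map `f` below are invented for illustration). It computes $H(Y \mid X)$ and $H(Y \mid f(X))$ for a small joint distribution and a non-injective $f$:

```python
import numpy as np

# Toy joint pmf p(x, y) with X in {0, 1, 2} (rows) and Y in {0, 1} (columns).
# The numbers are arbitrary; they just need to sum to 1.
p_xy = np.array([
    [0.20, 0.05],
    [0.10, 0.15],
    [0.25, 0.25],
])

def cond_entropy(joint):
    """H(Y | X) in bits, for a joint pmf whose rows are indexed by x."""
    h = 0.0
    for row in joint:
        px = row.sum()                 # marginal P[X = x]
        if px > 0:
            c = row[row > 0] / px      # conditional pmf p(y | x), zeros dropped
            h += px * -(c * np.log2(c)).sum()
    return h

# A non-injective f: merge x = 0 and x = 1 into a single symbol.
f = {0: 0, 1: 0, 2: 1}
q = np.zeros((2, 2))
for x, fx in f.items():
    q[fx] += p_xy[x]                   # joint pmf of (f(X), Y)

print(cond_entropy(p_xy))  # H(Y|X)    ≈ 0.9232 bits
print(cond_entropy(q))     # H(Y|f(X)) ≈ 0.9855 bits, larger, as claimed
```

Merging symbols of $X$ can only coarsen the conditioning, so the second value is never smaller than the first.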
I can help you if you promise to never ever again use the incorrect notation $P(Z)$ and $P(W)$, $P(Y|X)$, $P(Y|f(X))$. You can only take probabilities of events, and the same for conditional probabilities. You can use a mass function $p_Z(z) = P[Z=z]$ for all $z$ in the set of possible outcomes of the random variable $Z$. You can also use conditional probabilities $P[Y=y|X=x]$.
– Michael
Jul 29 at 4:44
Some textbooks use different notations, which causes confusion. However, I understand what each notation means: $P(X)$ in my writing means $p_X(x)$. Sorry for the trouble this caused you while reading. Please rectify any mistakes in my notation and help me prove it. Thanks so much. I have revised the equations as you suggested.
– khahuras
Jul 29 at 5:37
Hint: it holds that $I(X;Y) \geq I(Y;f(X))$ and also $I(X;Y) = H(Y) - H(Y \mid X)$.
– Stelios
Jul 29 at 6:51
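To spell out how the hint resolves the first question (an editor's sketch): since $f(X)$ is a deterministic function of $X$, $Y \to X \to f(X)$ is a Markov chain, so the data-processing inequality gives the first line below; the rest is the identity $I(X;Y) = H(Y) - H(Y \mid X)$ applied to both mutual informations.

\begin{align*}
I(Y;X) &\geq I(Y;f(X)) && \text{(data processing, $Y \to X \to f(X)$)} \\
H(Y) - H(Y \mid X) &\geq H(Y) - H(Y \mid f(X)) && \text{(expand both sides)} \\
H(Y \mid X) &\leq H(Y \mid f(X)) && \text{(cancel $H(Y)$, rearrange)}
\end{align*}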
Thanks Stelios!
– khahuras
Jul 29 at 11:05
Stelios's comment is the hint I was going to give as well. As for when $H(X \mid f(Y)) = H(X \mid Y)$ holds: intuitively, it is when $f(Y)$ tells you just as much "information" about $X$ as $Y$ does, i.e. $I(X;Y) = I(X;f(Y))$. It holds in various cases, such as when $X$ and $Y$ are independent, or when $f$ is invertible (so knowledge of $f(Y)$ determines $Y$).
– Michael
Aug 2 at 17:26
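To illustrate the independence case in the question's orientation, here is a small standalone Python check (editor's sketch; the marginals below are arbitrary). When $X$ and $Y$ are independent, conditioning on $X$, or on any $f(X)$, even a constant one, leaves $H(Y)$ unchanged, so equality holds even with a non-invertible $f$:

```python
import numpy as np

# Independence case: p(x, y) = p(x) p(y). Marginals are arbitrary choices.
px = np.array([0.2, 0.3, 0.5])
py = np.array([0.6, 0.4])
p_xy = np.outer(px, py)  # joint pmf, rows indexed by x

def cond_entropy(joint):
    """H(Y | X) in bits, for a joint pmf whose rows are indexed by x."""
    h = 0.0
    for row in joint:
        p = row.sum()
        if p > 0:
            c = row[row > 0] / p
            h += p * -(c * np.log2(c)).sum()
    return h

# Take f constant (the least invertible f possible): then f(X) carries no
# information about anything, and H(Y | f(X)) = H(Y).
merged = p_xy.sum(axis=0, keepdims=True)

print(cond_entropy(p_xy))   # H(Y|X)    ≈ 0.9710 bits (= H(Y), by independence)
print(cond_entropy(merged)) # H(Y|f(X)) ≈ 0.9710 bits, equal as expected
```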