If joint probabilities are equal, does that mean the distributions are equal?
Let $X$ and $Y$ be i.i.d. $N(0, 1)$, and let $S$ be a random sign (1 or -1, with equal probabilities) independent of $(X, Y)$.
\begin{align*}
P((SX,SY)\in B) &= P((X,Y)\in B,\ S=1) + P((-X,-Y)\in B,\ S=-1) \\
&= P((X,Y)\in B)\,P(S=1) + P((-X,-Y)\in B)\,P(S=-1) \\
&= \tfrac{1}{2}\,P((X,Y)\in B) + \tfrac{1}{2}\,P((-X,-Y)\in B) \\
&= P((X,Y)\in B).
\end{align*}
How can we say that $(SX, SY)$ is multivariate normal?
probability-theory measure-theory probability-distributions
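For reference, here is a sketch of the symmetry step that the last equality above relies on; the density notation $f$ and the reflected set $-B$ are introduced here for illustration and are not part of the original derivation. The pair $(X,Y)$ has joint density
$$f(x,y) = \frac{1}{2\pi}\, e^{-(x^2+y^2)/2},$$
which satisfies $f(-x,-y) = f(x,y)$. Writing $-B = \{(-u,-v) : (u,v) \in B\}$, the change of variables $(u,v) = (-x,-y)$ (whose Jacobian has absolute value $1$) gives, for every Borel set $B \subseteq \mathbb{R}^2$,
$$P((-X,-Y) \in B) = \int_{-B} f(x,y)\,dx\,dy = \int_{B} f(u,v)\,du\,dv = P((X,Y) \in B).$$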
asked Jul 21 at 21:23 by Shana
The joint density function of $(SX, SY)$ is the same for each choice of $S$, so it coincides with the joint density of $(X, Y)$.
– herb steinberg
Jul 21 at 21:41
@herbsteinberg If $P((SX, SY) \in B) = P((X,Y) \in B)$, does that mean the joint pdfs are necessarily equal?
– Shana
Jul 21 at 22:47
Yes, of course! By the very definition of joint distribution.
– Kavi Rama Murthy
Jul 21 at 23:37
@KaviRamaMurthy: I can see it intuitively. However, is it a general result, or does it hold only because the densities are continuous, so that differentiating the joint CDF gives the joint PDF?
– Shana
Jul 21 at 23:56
The joint distribution of $(SX,SY)$ is that of $(X,Y)$, from which the result follows. That $X$ is a continuous random variable is irrelevant.
– Math1000
Jul 24 at 3:11
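Below is a minimal simulation sketch, not part of the original thread, that illustrates the claim in the comments empirically using NumPy; the seed, the sample size, and the particular summary checks are arbitrary choices made here for illustration.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(0)
n = 1_000_000

# i.i.d. standard normals and an independent random sign
x = rng.standard_normal(n)
y = rng.standard_normal(n)
s = rng.choice([-1.0, 1.0], size=n)
sx, sy = s * x, s * y

# First and second moments: if (SX, SY) is standard bivariate normal with
# independent components, we expect means ~ 0, variances ~ 1, covariance ~ 0.
print("means:", sx.mean(), sy.mean())
print("vars: ", sx.var(), sy.var())
print("cov:  ", np.cov(sx, sy)[0, 1])

# If (SX, SY) has the same law as (X, Y), then SX + SY ~ N(0, 2).
# Compare the empirical CDF of SX + SY with the N(0, 2) CDF at a few points.
for t in (-2.0, -1.0, 0.0, 1.0, 2.0):
    empirical = np.mean(sx + sy <= t)
    theoretical = 0.5 * (1.0 + erf(t / 2.0))  # N(0, 2) CDF evaluated at t
    print(f"P(SX+SY <= {t:4.1f}): empirical {empirical:.4f} vs normal {theoretical:.4f}")
```

With a sample this large, the empirical moments and tail probabilities should agree with the standard-normal values to a couple of decimal places, consistent with $(SX, SY)$ having the same joint distribution as $(X, Y)$.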