Distributing expectation over a quadratic function
I saw this proof in the MIT Probability Courseware:
I understand linearity of expectation and have gone through its proof as well. But how is the expectation distributed over a quadratic function in the second step of the proof here?
To clarify, I want to understand what allows us to write
$$E(X^2 + a) = E(X^2) + E(a).$$
This is not linear in $X$, so shouldn't linearity of expectation fail to apply?
Further clarification: I am also told in the course (Slide 2) not to assume that
$$E[g(X)] = g(E[X])$$
holds in general. If I can do a change of variables as suggested by some answers, can the above always be made true?
linear-algebra probability statistics random-variables expectation
asked Jul 23 at 3:51 by Amit Tomar, last edited Jul 23 at 4:07
3 Answers
The expected value of the sum of random variables is the sum of their expected values. This is true for any two random variables for which expected values exist, not just for a single variable and linear functions of itself.
That is, in general, if $Y$ and $Z$ are random variables
and if $E(Y)$ and $E(Z)$ both exist,
$$ E (Y + Z) = E(Y) + E(Z). $$
But if $X$ is a random variable, then $Y = X^2$ is a random variable
and so is $Z = -2\mu X + \mu^2$.
Moreover, $Y + Z = X^2 - 2\mu X + \mu^2$. Therefore
\begin{align}
E(X^2 - 2\mu X + \mu^2) &= E(Y + Z)\\
&= E(Y) + E(Z)\\
&= E(X^2) + E(-2\mu X + \mu^2).
\end{align}
I think you can work out the rest of it.
answered Jul 23 at 4:02 by David K
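A minimal numerical illustration of this additivity (a simulation sketch, assuming NumPy; the choice of distribution is arbitrary): the sample average is linear in exactly the same way as the expectation, so evaluating the expression as a whole and term by term gives the same value.

```python
# Simulation sketch: additivity of expectation, checked with sample averages.
# Assumes NumPy; the exponential distribution is an arbitrary example --
# any distribution with a finite second moment works.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)
mu = x.mean()  # sample estimate of E(X)

lhs = np.mean(x**2 - 2 * mu * x + mu**2)           # E((X - mu)^2) as one expression
rhs = np.mean(x**2) - 2 * mu * np.mean(x) + mu**2  # split term by term via linearity

print(lhs, rhs)  # the two values agree up to floating-point rounding
```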
Remember that for a constant $k$, $E(kX)=kE(X)$; here $-2\mu$ and $\mu^2$ are constants.
\begin{align}
E(X^2-2\mu X+\mu^2) &= E(X^2)-E(2\mu X)+E(\mu^2)\\
&= E(X^2)-2\mu E(X)+\mu^2 E(1)\\
&= E(X^2)-2\mu E(X)+\mu^2.
\end{align}
Edit: Let $Y=X^2$; then $Y$ is a random variable, hence the problem becomes $E(Y-2\mu X+\mu^2)$.
Edit 2: If you let $Y=g(X)$, then $E(g(X))=E(Y)$, but the $g$ does not come out of the expectation in general. In the context of $g(X)=X^2$, note that the quadratic stays inside the expectation: in general $E(X^2) \ne E(X)^2$.
I want to understand what allows us to distribute over the square of $X$; linearity of expectation allows distributing over only linear functions of $X$, right? – Amit Tomar, Jul 23 at 3:56
Let $Y=X^2$; then $Y$ is a random variable, hence the problem becomes $E(Y-2\mu X+\mu^2)$. Page 6, the point on linearity in the bottom table, might be of interest to you. – Siong Thye Goh, Jul 23 at 4:01
answered Jul 23 at 3:54 by Siong Thye Goh, last edited Jul 23 at 4:25
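To see concretely that the square stays inside the expectation, consider a fair coin flip $X$ taking the values $0$ and $1$ with equal probability. Then
$$E[X] = \tfrac{1}{2}, \qquad E[X^2] = \tfrac{1}{2} \ne \big(E[X]\big)^2 = \tfrac{1}{4}.$$
Writing $Y = X^2$ only renames the quantity: $E[g(X)] = E[Y] = \tfrac{1}{2}$, which is not $g(E[X]) = \tfrac{1}{4}$.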
I think your question has to do with this step:
$$E\big((X-\mu)^2\big) = E(X^2 - 2\mu X + \mu^2) = E(X^2) - 2\mu E(X) + \mu^2.$$
By linearity,
$$E(X^2 - 2\mu X + \mu^2) = E(X^2) - E(2\mu X) + E(\mu^2) = E(X^2) - 2\mu E(X) + E(\mu^2).$$
You should note that $\mu = E(X)$ is simply a constant, so we can pull it out (and $E(\mu^2)=\mu^2$). Substituting, we have
$$E(X^2) - 2\mu\,\mu + \mu^2 = E(X^2) - 2\mu^2 + \mu^2 = E(X^2) - \mu^2.$$
It may also be useful to remember what the expectation is. For discrete random variables it is
$$E(X) = \sum_i x_i\, p(x_i),$$
and for continuous random variables we have
$$E(X) = \int_{-\infty}^{\infty} x f(x)\, dx.$$
So your question is about linearity: $\mu$ is a constant; consider why pulling it out works with both summations and integrals.
answered Jul 23 at 4:07 by RHowe, last edited Jul 23 at 4:59
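For completeness, here is the verification the answer hints at, using the discrete definition (the continuous case is identical with the sum replaced by an integral). For constants $a$ and $b$,
$$E(aX + b) = \sum_i (a x_i + b)\, p(x_i) = a \sum_i x_i\, p(x_i) + b \sum_i p(x_i) = a\,E(X) + b,$$
since the probabilities sum to $1$. The same term-by-term splitting handles $E(X^2 - 2\mu X + \mu^2)$ once $X^2$ is treated as a random variable in its own right.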