Expected value and distributions
I'm reading a book at the moment and it has a theorem that states this:

if $X \sim N(\mu,\sigma^2)$ then $E(X) = \mu$.

My question is, when would it not equal $\mu$? I thought the expected value is the mean of any distribution, not just the normal. Is this rule only the case for the normal distribution, all symmetric distributions, or all distributions?

Thanks

probability-distributions random-variables expectation
asked Jul 26 at 23:10 by Bucephalus
2 Answers
Accepted answer (3 votes)
The expected value $E(X) = \mu$ is often connected to the parameters of the distribution, for instance
$$ X \sim \mathrm{Bin}(n,p), \quad E(X) = np$$
$$ X \sim \mathrm{Geom}(p), \quad E(X) = \frac{1}{p} $$
$$ X \sim \mathrm{NegBin}(r,p), \quad E(X) = \frac{r}{p}$$
$$ X \sim \mathrm{Pois}(\lambda), \quad E(X) = \lambda $$
$$ X \sim N(\mu, \sigma^2), \quad E(X) = \mu $$
$$ X \sim \mathrm{Gamma}(\alpha,\beta), \quad E(X) = \frac{\alpha}{\beta}$$
$$ X \sim U(a,b), \quad E(X) = \frac{b+a}{2} $$
$$ X \sim \mathrm{Beta}(\alpha,\beta), \quad E(X) = \frac{\alpha}{\alpha+\beta} $$
So $E(X) = \mu$ is a function of the parameters simply because of how the expected value is defined:
$$ E(X) = \int_{-\infty}^{\infty} x f(x)\, dx$$
where $f(x)$ is the density function, for continuous random variables, and for discrete random variables we have
$$ E(X) = \sum_i x_i\, p(x_i) $$
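As a quick sanity check of this definition, one can evaluate the integral numerically for a normal density and confirm the result is $\mu$. A minimal sketch, assuming a Python environment with NumPy and SciPy available:

```python
# Numerically evaluate E(X) = integral of x * f(x) dx for X ~ N(mu, sigma^2)
# and confirm the result is (approximately) mu.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

mu, sigma = 1.7, 2.3  # arbitrary example parameters

integrand = lambda x: x * norm.pdf(x, loc=mu, scale=sigma)
expected_value, _ = quad(integrand, -np.inf, np.inf)

print(expected_value)  # ~1.7, i.e. approximately mu
```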
It isn't surprising that we get the parameters of the distribution in our expectation because of this. For instance, suppose $X \sim U(a,b)$. Its pdf is given by
$$ f(x) = \begin{cases} \frac{1}{b-a} & \text{for } a \leq x \leq b \\ 0 & \text{for } x < a \text{ or } x > b \end{cases}$$
so
$$ E(X) = \int_{-\infty}^{\infty} x f(x)\, dx = \frac{1}{b-a}\int_a^b x\, dx $$
$$ E(X) = \frac{1}{b-a}\,\frac{x^2}{2}\Big|_a^b = \frac{b^2-a^2}{2(b-a)} = \frac{(b-a)(b+a)}{2(b-a)} = \frac{b+a}{2}. $$

edited Jul 27 at 0:06, answered Jul 26 at 23:51 by RHowe
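The same value can be checked empirically: the sample mean of uniform draws converges to $\frac{b+a}{2}$. A short simulation sketch, assuming NumPy is available:

```python
# Monte Carlo check that the sample mean of U(a, b) draws approaches (a + b) / 2.
import numpy as np

rng = np.random.default_rng(seed=0)
a, b = 2.0, 10.0

samples = rng.uniform(a, b, size=1_000_000)
print(samples.mean())   # ~6.0
print((a + b) / 2)      # 6.0
```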
Yes, I'm understanding more now thanks to both of you. Thank you.
– Bucephalus
Jul 27 at 0:09
You're welcome, no problem.
– RHowe
Jul 27 at 0:19
Answer (1 vote)
The normal distribution $N(\mu, \sigma^2)$ is a probability distribution based on the pdf:
$$f(x; \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x - \mu)^2}{2\sigma^2}}$$
Nowhere in that definition is it guaranteed that $\mu$ is the mean of the distribution, or that $\sigma$ is its standard deviation. Both need to be proven.
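That proof can also be carried out symbolically. A sketch using SymPy (assuming it is installed), which integrates $x f(x)$ over the real line and returns $\mu$:

```python
# Symbolic check that the mean of the N(mu, sigma^2) density is mu.
import sympy as sp

x, mu = sp.symbols('x mu', real=True)
sigma = sp.symbols('sigma', positive=True)

f = sp.exp(-(x - mu)**2 / (2 * sigma**2)) / sp.sqrt(2 * sp.pi * sigma**2)
mean = sp.integrate(x * f, (x, -sp.oo, sp.oo))

print(sp.simplify(mean))  # mu
```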
With other probability distributions, there's no guarantee that they will be directly parameterised by their expected value. For example, the log-normal distribution $\text{Lognormal}(\mu, \sigma^2)$ has an expected value of $e^{\mu+\frac{\sigma^2}{2}}$.

answered Jul 26 at 23:18 by ConMan
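To see the log-normal point concretely, a short simulation sketch (again assuming NumPy) shows the sample mean matching $e^{\mu + \sigma^2/2}$ rather than $\mu$:

```python
# The mean of Lognormal(mu, sigma^2) is exp(mu + sigma^2 / 2), not mu.
import numpy as np

rng = np.random.default_rng(seed=0)
mu, sigma = 0.5, 1.0

samples = rng.lognormal(mean=mu, sigma=sigma, size=1_000_000)
print(samples.mean())              # ~2.72
print(np.exp(mu + sigma**2 / 2))   # 2.718...
```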
I think I'm understanding what you're saying @ConMan. You're saying that we will use $E(X)$ to be our estimate of the population parameter $\mu$ and we will do the same for $\mathrm{var}(X)$ and $\sigma^2$.
– Bucephalus
Jul 26 at 23:26
No, I'm saying that the probability distribution is defined by a function, which takes two parameters. The first parameter happens to be the expected value of the distribution, and the second one happens to be its variance. Similarly, the uniform distribution is parameterised by the start and end points, and its expected value is the average of the two.
– ConMan
Jul 26 at 23:30
And so I'm clear, neither of these are estimates. They are provably equal to the required values.
– ConMan
Jul 26 at 23:31
Oh yeah, I get it now: the parameters of a distribution aren't necessarily the average and the variance. They just happen to be the parameters for the normal distribution. So that would probably be the case for any symmetric distribution too? For example, $x^2, -1 < x < 1$?
– Bucephalus
Jul 26 at 23:34
Nope, not necessarily. Like I said, the uniform distribution $U(a, b)$ is a flat line between the points $a$ and $b$. Its mean value is at $\frac{a+b}{2}$, and it's symmetric about that point. Parameters are just "the values necessary to define the shape of the distribution", and they do different things in different distributions.
– ConMan
Jul 27 at 3:13