Mean squared error calculation [closed]
























If $X_1,\dots,X_n \sim N(\mu, \sigma^2)$ where $\mu$ is known and $\sigma^2$ is unknown, calculate the MSE of $V^2$:

$V^2 = \frac{1}{n} \sum_{X_i}^{n} \operatorname{Var}(X_i) = \sigma^2$

Therefore:

$MSE(V^2) = \operatorname{Var}(V^2) = \frac{1}{n^2}\, n \operatorname{Var}\left[(X_1-\mu)^2\right] = \frac{1}{n}\operatorname{Var}\left[\sigma^2\left(\frac{X_1-\mu}{\sigma}\right)^2\right] = \frac{1}{n}\,\sigma^4 \operatorname{Var}\left[\left(\frac{X_1-\mu}{\sigma}\right)^2\right] = \frac{2\sigma^4}{n}$

However, I do not understand some of the steps:

  1. Where does the $X_1$ suddenly come from (instead of $X_i$)?

  2. And then in the next step, I am aware it has something to do with the fact that $\left(\frac{X-\mu}{\sigma}\right)^2 \sim \chi^2_1$, but I cannot connect the dots.

Could someone break these steps down for me? I do not have a mathematical background, so stating the obvious is very welcome.
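
For concreteness, here is a minimal R check of the fact used in the last step, namely that $\operatorname{Var}\left[\left(\frac{X-\mu}{\sigma}\right)^2\right] = 2$ (the values of $\mu$, $\sigma$, the seed, and the sample size below are arbitrary, chosen only for illustration):

# Monte Carlo check that Var[((X - mu)/sigma)^2] = 2 when X ~ N(mu, sigma^2)
set.seed(1)                   # arbitrary seed, for reproducibility
mu <- 3; sigma <- 2           # arbitrary illustrative values
x  <- rnorm(10^6, mu, sigma)  # one large normal sample
z2 <- ((x - mu)/sigma)^2      # squared standardized values, distributed Chisq(1)
var(z2)                       # should be close to 2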







asked Jul 15 at 9:43 by Danka
closed as unclear what you're asking by StubbornAtom, Mostafa Ayaz, José Carlos Santos, user223391, Parcly Taxel Jul 16 at 4:26












  • If $X_i \sim N(\mu, \sigma^2)$, then what is the difference between $\operatorname{Var}(X_i)$ and $\sigma^2$? Are the $X_i$ independent? Also, what is the meaning of $\sum_{X_i}^{n}$? Finally, it looks like you are defining $V^2$ as the constant $\sigma^2$, so there is no estimation going on and the variance of $V^2$ is 0. I suspect you are incorrectly interpreting a problem you were given; I would expect $V^2$ to be some estimate formed from the samples $X_1, \dots, X_n$. My best guess at the correct definition of $V^2$ is $$V^2 := \frac{1}{n}\sum_{i=1}^{n} (X_i-\mu)^2.$$
    – Michael
    Jul 15 at 10:45











  • Agree with @Michael. // If $\hat\tau$ is an estimator of $\tau$, then $MSE(\hat\tau) = E[(\tau - \hat\tau)^2]$. Also, $MSE(\hat\tau) = \operatorname{Var}(\hat\tau)$, provided that $E(\hat\tau) = \tau$ (that is, provided that $\hat\tau$ is unbiased). // Finally, for $X_1 \sim \mathsf{Norm}(\mu, \sigma)$ one has $\left(\frac{X_1 - \mu}{\sigma}\right)^2 \sim \mathsf{Chisq}(1)$. You can look up the mean and variance of the chi-squared distribution on Wikipedia or in your text.
    – BruceET
    Jul 15 at 19:03










  • Except for the bad start of defining $V^2$ incorrectly, you have pretty much the right idea. @Michael showed you how to start. Hope my answer gives you clues on how to fix the derivation.
    – BruceET
    Jul 15 at 23:10
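
As a concrete illustration of the estimator Michael suggests in the comments above, a minimal sketch in R (the sample size, mean, standard deviation, and seed are arbitrary illustrative choices, not taken from the comments):

# One realization of V^2 = (1/n) * sum (X_i - mu)^2, with mu treated as known
set.seed(2)                    # arbitrary seed
n <- 10; mu <- 5; sigma <- 3   # arbitrary illustrative values
x <- rnorm(n, mu, sigma)       # one sample of size n
V2 <- mean((x - mu)^2)         # note: uses the known mu, not mean(x)
V2                             # one realization of the estimator of sigma^2 = 9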















1 Answer



























Let $X_1, X_2, \dots, X_n$ be a random sample from $\mathsf{Norm}(\mu, \sigma)$, where $\mu$ is known and $\sigma^2$ is to be estimated by
$V = \frac{1}{n}\sum_{i=1}^{n} (X_i - \mu)^2.$ (Note the use of the known population mean $\mu$, not the sample mean $\bar X$.) You want to evaluate
$MSE(V)$. @Michael and I have given you some hints. (Notice that my $V$ is
your $V^2$, to simplify notation a bit.)
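
For reference, the equality $MSE(V) = \operatorname{Var}(V)$ rests on the standard decomposition (a general fact about any estimator, not something special to this problem):

$$MSE(V) = E\left[(V - \sigma^2)^2\right] = \operatorname{Var}(V) + \bigl(E(V) - \sigma^2\bigr)^2,$$

and the bias term $\bigl(E(V) - \sigma^2\bigr)^2$ vanishes here because $V$ turns out to be unbiased for $\sigma^2$.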



With that orientation, I hope the following example with specific numbers for the quantities involved will help you do the required general derivation.



Suppose $n = 5$, $\mu = 0$ and $\sigma = 4$. Then
$Q = \frac{nV}{\sigma^2} = \frac{5V}{16} \sim \mathsf{Chisq}(n=5)$, which has mean $n=5$ and variance $2n=10$. So $E(V) = \frac{\sigma^2}{n}\, n = 16$ (showing that $V$ is unbiased for $\sigma^2$), and
$\operatorname{Var}(V) = MSE(V) = \frac{\sigma^4}{n^2}\, 2n = 102.4.$
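
A quick arithmetic check of these two numbers (just restating the example values $n = 5$, $\sigma = 4$ in R):

n <- 5; sg <- 4
sg^2        # E(V)   = sigma^2     = 16
2*sg^4/n    # MSE(V) = 2*sigma^4/n = 102.4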



The following demonstration, using R statistical software, with a million such samples of size $n=5$,
illustrates these numerical results to several significant digits. In
the program, MAT is a $10^6 \times 5$ matrix in which each row is a sample
of size $5$.



set.seed(715)  # retain for exactly the same simulation; delete for a fresh run
m = 10^6; n = 5; mu = 0; sg = 4
x = rnorm(m*n, mu, sg); MAT = matrix(x, nrow = m)
v = rowMeans((MAT - mu)^2)  # using 'known' population mean, not sample mean
mean(v); mean((v-sg^2)^2)
[1] 15.99998   # aprx E(V) = 16
[1] 102.5      # aprx MSE(V) = 102.4


The plot below shows the simulated distribution of
$Q = \frac{nV}{\sigma^2} = \frac{5V}{16} = 0.3125\,V$ along with the density curve of $\mathsf{Chisq}(5)$.



hist(5*v/sg^2, prob=T, br=40, xlab="q", col="skyblue2", main="")
curve(dchisq(x, 5), add=T, lwd=2, n=1001)


[Histogram of the simulated values of $Q$ with the $\mathsf{Chisq}(5)$ density curve overlaid]
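
A further check one could run on the same simulated values (not part of the original demonstration; it reuses v and sg from the code above): the mean and variance of $Q$ should be close to $n = 5$ and $2n = 10$.

q <- 5*v/sg^2    # simulated values of Q = nV/sigma^2
mean(q); var(q)  # should be approximately 5 and 10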






answered Jul 15 at 22:52 (edited Jul 15 at 23:36) – BruceET