Show that $\hat\beta$ and $\hat\sigma^2$ are unbiased in a special case of the linear regression model [closed]
























Consider the OLS model $Y = X\beta + \epsilon$, where the rows of the matrix $X$ are independent multivariate normal vectors with mean $0$ and covariance matrix $\Sigma$. The vector $\epsilon$ is independent of $X$, with $E(\epsilon) = 0$ and $\operatorname{Var}(\epsilon) = \sigma^2 I$.



I need to show that $\hat\beta = (X^T X)^{-1} X^T Y$ and $\hat\sigma^2 = \frac{1}{\operatorname{rank}(I-H)} \sum_{i=1}^n \hat\epsilon_i^2$, where $H = X(X^T X)^{-1} X^T$, are still unbiased estimators.



I would really appreciate any help.




















closed as off-topic by Davide Giraudo, amWhy, Nils Matthes, callculus, Parcly Taxel Jul 19 at 1:11


This question appears to be off-topic. The users who voted to close gave this specific reason:


  • "This question is missing context or other details: Please improve the question by providing additional context, which ideally includes your thoughts on the problem and any attempts you have made to solve it. This information helps others identify where you have difficulties and helps them write answers appropriate to your experience level." – Davide Giraudo, amWhy, Nils Matthes, callculus, Parcly Taxel
If this question can be reworded to fit the rules in the help center, please edit the question.


















asked Jul 11 at 17:17 by User1999

edited Jul 12 at 15:21 by V. Vancak








          1 Answer






























Conditionally on $X$,
$$
\mathbb{E}[\hat\beta \mid X] = \mathbb{E}[(X'X)^{-1}X'y \mid X] = (X'X)^{-1}X'\bigl(X\beta + \mathbb{E}[\epsilon \mid X]\bigr) = (X'X)^{-1}(X'X)\beta = \beta.
$$

Moreover,
$$
\frac{1}{\sigma^2}\sum_{i=1}^n \hat\epsilon_i^2 \,\Big|\, X \;\sim\; \chi^2_{n-p}
$$
and
$$
\operatorname{rank}(I-H) = n-p,
$$

hence
$$
\mathbb{E}[\hat\sigma^2 \mid X] = \frac{\sigma^2}{n-p}\,\mathbb{E}\bigl[\chi^2_{n-p} \mid X\bigr] = \frac{\sigma^2(n-p)}{n-p} = \sigma^2.
$$

Since neither conditional expectation depends on $X$, the law of iterated expectations gives $\mathbb{E}[\hat\beta] = \beta$ and $\mathbb{E}[\hat\sigma^2] = \sigma^2$, so both estimators remain unbiased under the random design.
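For readers who want to sanity-check the algebra, here is a minimal Monte Carlo sketch (not part of the original answer), assuming Python with NumPy; the values of $n$, $p$, $\beta$, $\Sigma$ and $\sigma^2$ below are arbitrary illustration choices. Averaging $\hat\beta$ and $\hat\sigma^2$ over many independent draws of $(X, \epsilon)$ should reproduce $\beta$ and $\sigma^2$ up to simulation noise.

    # Monte Carlo sketch (illustration only): empirically checks that beta_hat and
    # sigma2_hat are unbiased when the rows of X are i.i.d. N(0, Sigma), independent
    # of epsilon. All numeric values (n, p, beta, Sigma, sigma2, reps) are made up.
    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 50, 3
    beta = np.array([1.0, -2.0, 0.5])
    Sigma = np.array([[1.0, 0.3, 0.0],
                      [0.3, 1.0, 0.2],
                      [0.0, 0.2, 1.0]])
    sigma2 = 2.0
    reps = 20_000

    beta_hats = np.empty((reps, p))
    sigma2_hats = np.empty(reps)
    for r in range(reps):
        X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)   # random design
        eps = rng.normal(0.0, np.sqrt(sigma2), size=n)            # errors independent of X
        y = X @ beta + eps

        b_hat = np.linalg.solve(X.T @ X, X.T @ y)                 # (X'X)^{-1} X'y
        resid = y - X @ b_hat
        sigma2_hats[r] = resid @ resid / (n - p)                  # rank(I - H) = n - p
        beta_hats[r] = b_hat

    print("average beta_hat  :", beta_hats.mean(axis=0))   # should be close to beta
    print("average sigma2_hat:", sigma2_hats.mean())        # should be close to sigma2

The printed averages should be close to the true $\beta$ and $\sigma^2$; how close depends on the number of replications.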





















































answered Jul 12 at 15:20 by V. Vancak











