Beta parameter estimation in least squares method by partial derivative
I was reading a document on linear regression and I just can't understand how the author arrived at the estimate of the beta parameter. His results are as follows:
$SQ(\alpha, \beta)=\sum\limits_{i=1}^n \epsilon_i^2=\sum\limits_{i=1}^n (y_i-\alpha-\beta x_i)^2$

$\left.\frac{\partial SQ(\alpha, \beta)}{\partial \alpha}\right|_{\alpha=\hat\alpha} = 0 \implies \sum\limits_{i=1}^n (y_i - \hat\alpha - \hat\beta x_i) = 0$

$\left.\frac{\partial SQ(\alpha, \beta)}{\partial \beta}\right|_{\beta=\hat\beta} = 0 \implies \sum\limits_{i=1}^n x_i (y_i - \hat\alpha - \hat\beta x_i) = 0$

$\sum\limits_{i=1}^n y_i = n\hat\alpha + \hat\beta \sum\limits_{i=1}^n x_i \implies \hat\alpha = \bar{y} - \hat\beta\bar{x}$

$\sum\limits_{i=1}^n x_i y_i = \hat\alpha \sum\limits_{i=1}^n x_i + \hat\beta \sum\limits_{i=1}^n x_i^2 \implies \hat\beta = \dfrac{\sum\limits_{i=1}^n x_i y_i - n\bar{x}\bar{y}}{\sum\limits_{i=1}^n x_i^2 - n\bar{x}^2}$
However, my result of the partial derivative with $\beta$ fixed is $\hat\beta = \dfrac{\sum\limits_{i=1}^n x_i y_i - \bar{x}\bar{y}}{\sum\limits_{i=1}^n x_i^2 - \bar{x}^2}$.
What is going on? What did I miss?
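(Editorial note: a quick numeric check, using made-up data that is not part of the original post, shows that the two candidate formulas really do disagree, and that the version carrying the factors of $n$ agrees with an off-the-shelf least-squares fit.)

```python
import numpy as np

# Hypothetical illustrative data (not from the original post)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.2, 5.9, 8.1, 9.9])

n = len(x)
xbar, ybar = x.mean(), y.mean()

# The document's formula (with the factors of n)
beta_doc = (np.sum(x * y) - n * xbar * ybar) / (np.sum(x ** 2) - n * xbar ** 2)

# The questioner's formula (without the factors of n)
beta_q = (np.sum(x * y) - xbar * ybar) / (np.sum(x ** 2) - xbar ** 2)

# Reference slope from a standard least-squares fit
beta_ref = np.polyfit(x, y, 1)[0]

print(beta_doc, beta_q, beta_ref)
```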
least-squares linear-regression
Welcome to MSE. It is in your best interest that you type your questions (using MathJax) instead of posting links to pictures.
– José Carlos Santos
Jul 26 at 0:03
No prob. I'll edit the question!
– Luís Muniz
Jul 26 at 13:37
edited Jul 26 at 14:08
asked Jul 25 at 23:55
Luís Muniz
1 Answer
It might be due to missing parentheses in your attempt: $$\sum_{i=1}^n (x_i^2 - \bar{x}^2) = \left(\sum_{i=1}^n x_i^2\right) - n \bar{x}^2.$$
By substituting $\hat\alpha = \bar{y} - \hat\beta \bar{x}$ and $\sum_i x_i = n \bar{x}$ into the last equation we have
\begin{align}
\sum_i x_i y_i &= (\bar{y} - \hat\beta \bar{x}) \sum_i x_i + \hat\beta \sum_i x_i^2 \\
\sum_i x_i y_i &= n \bar{x}\bar{y} + \hat\beta \left(\Big(\sum_i x_i^2\Big) - n \bar{x}^2\right)
\end{align}
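(Editorial note: the parenthesization point can be checked numerically. The sketch below, using made-up data not from the original thread, confirms that summing $x_i^2 - \bar{x}^2$ over $i$ introduces the factor of $n$ on the $\bar{x}^2$ term, while the version without $n$ gives a different number.)

```python
import numpy as np

# Hypothetical illustrative data (not from the original thread)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
n = len(x)
xbar = x.mean()

# Summing (x_i^2 - xbar^2) over i multiplies the xbar^2 term by n...
lhs = np.sum(x ** 2 - xbar ** 2)        # sum_i (x_i^2 - xbar^2)
rhs = np.sum(x ** 2) - n * xbar ** 2    # (sum_i x_i^2) - n * xbar^2

# ...so subtracting a single xbar^2 from the sum is a different quantity.
wrong = np.sum(x ** 2) - xbar ** 2      # (sum_i x_i^2) - xbar^2

print(lhs, rhs, wrong)
```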
Man, it really looks like this is the problem. Thanks for calling my attention to this. However, when I do the math, the summation originally includes only $x_i^2$, not $\bar{x}^2$: it is $\left(\sum\limits_{i=1}^n x_i^2\right) - \bar{x}^2$. Am I missing something here?
– Luís Muniz
Jul 26 at 14:00
edited Jul 26 at 16:37
answered Jul 25 at 23:59
angryavian