Beta parameter estimation in the least squares method by partial derivatives

I was reading a document on linear regression and I just can't understand how the author arrived at the estimate of the beta parameter. His results are as follows:

$$SQ(\alpha, \beta)=\sum\limits_{i=1}^{n}\epsilon_i^2=\sum\limits_{i=1}^{n}(y_i-\alpha-\beta x_i)^2$$

$$\left.\frac{\partial SQ(\alpha, \beta)}{\partial \alpha}\right|_{\alpha=\hat\alpha} = 0 \implies \sum\limits_{i=1}^{n}(y_i - \hat\alpha - \hat\beta x_i) = 0$$

$$\left.\frac{\partial SQ(\alpha, \beta)}{\partial \beta}\right|_{\beta=\hat\beta} = 0 \implies \sum\limits_{i=1}^{n} x_i(y_i - \hat\alpha - \hat\beta x_i) = 0$$

$$\sum\limits_{i=1}^{n} y_i = n\hat\alpha + \hat\beta \sum\limits_{i=1}^{n} x_i \implies \hat\alpha = \bar{y} - \hat\beta\bar{x}$$

$$\sum\limits_{i=1}^{n} x_i y_i = \hat\alpha \sum\limits_{i=1}^{n} x_i + \hat\beta \sum\limits_{i=1}^{n} x_i^2 \implies \hat\beta = \frac{\sum\limits_{i=1}^{n} x_i y_i - n\bar{x}\bar{y}}{\sum\limits_{i=1}^{n} x_i^2 - n\bar{x}^2}$$

However, my own result from the partial derivative with respect to $\beta$ is $\hat\beta = \dfrac{\sum\limits_{i=1}^{n} x_i y_i - \bar{x}\bar{y}}{\sum\limits_{i=1}^{n} x_i^2 - \bar{x}^2}$.

What is going on? What did I miss?
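(A quick numerical sanity check, not part of the original question: the NumPy sketch below uses made-up data to evaluate both expressions for $\hat\beta$ and compares them with the slope from a direct least-squares fit. Only the version with the factors of $n$ matches.)

```python
import numpy as np

# Made-up data for illustration only (any roughly linear data will do).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=50)
y = 2.5 + 1.7 * x + rng.normal(scale=0.5, size=50)

n = len(x)
xbar, ybar = x.mean(), y.mean()

# Formula from the document: factors of n in numerator and denominator.
beta_with_n = (np.sum(x * y) - n * xbar * ybar) / (np.sum(x**2) - n * xbar**2)

# Formula without the factors of n (the result asked about above).
beta_without_n = (np.sum(x * y) - xbar * ybar) / (np.sum(x**2) - xbar**2)

# Reference slope from a direct least-squares fit.
slope_ref, intercept_ref = np.polyfit(x, y, deg=1)

print(beta_with_n, slope_ref)   # these two agree
print(beta_without_n)           # this one does not
```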







1 Answer
It might be due to missing parentheses in your attempt: $$\sum_{i=1}^n (x_i^2 - \bar{x}^2) = \left(\sum_{i=1}^n x_i^2\right) - n \bar{x}^2.$$

By substituting $\hat\alpha = \bar{y} - \hat\beta \bar{x}$ and $\sum_i x_i = n \bar{x}$ into the last equation, we have
\begin{align}
\sum_i x_i y_i &= (\bar{y} - \hat\beta \bar{x}) \sum_i x_i + \hat\beta \sum_i x_i^2 \\
\sum_i x_i y_i &= n \bar{x}\bar{y} + \hat\beta \left(\Bigl(\sum_i x_i^2\Bigr) - n \bar{x}^2\right)
\end{align}
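(The parenthesisation point can also be checked numerically; a minimal sketch with arbitrary data, not part of the original answer:)

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 7.0])         # arbitrary data
n = len(x)
xbar = x.mean()

lhs = np.sum(x**2 - xbar**2)               # sum over i of (x_i^2 - xbar^2)
rhs_correct = np.sum(x**2) - n * xbar**2   # (sum of x_i^2) - n * xbar^2
rhs_wrong = np.sum(x**2) - xbar**2         # (sum of x_i^2) - xbar^2

print(lhs, rhs_correct, rhs_wrong)         # lhs == rhs_correct; rhs_wrong differs
```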






  • Man, it really looks like this is the problem. Thanks for calling my attention to it. However, when I still do the math, the summation that includes $x_i^2$ does not pick up the factor of $n$ on $\bar{x}^2$: I get $\left(\sum\limits_{i=1}^{n} x_i^2\right) - \bar{x}^2$. Am I missing something here?
    – Luís Muniz
    Jul 26 at 14:00