How to obtain expressions for coefficients from OLS formula?
Consider the standard linear regression model $y_i = \alpha + \beta D_i + e_i$, where the coefficients are defined by linear projections and $D_i$ is a dummy variable. In the population, the coefficients are given by:
$$\alpha = E[y_i \mid D_i = 0] \quad \text{and} \quad \beta = E[y_i \mid D_i = 1] - E[y_i \mid D_i = 0].$$
Using OLS to estimate the coefficients, we get:
$$\widehat{\alpha} = \frac{1}{\sum_{i=1}^{N} \mathbf{1}(D_i=0)} \sum_{i=1}^{N} \mathbf{1}(D_i=0)\, y_i$$
$$\widehat{\beta} = \frac{1}{\sum_{i=1}^{N} \mathbf{1}(D_i=1)} \sum_{i=1}^{N} \mathbf{1}(D_i=1)\, y_i - \frac{1}{\sum_{i=1}^{N} \mathbf{1}(D_i=0)} \sum_{i=1}^{N} \mathbf{1}(D_i=0)\, y_i$$
In other words, $\widehat{\alpha}$ is just the sample mean of $y_i$ in the subsample with $D_i = 0$.
My question is: how can we arrive at the above coefficient estimates using the standard OLS formulas? That is,
$$\widehat{\alpha} = \overline{y} - \overline{D}\,\widehat{\beta} \quad \text{and} \quad \widehat{\beta} = \frac{\sum_{i=1}^{N}(D_i - \overline{D})(y_i - \overline{y})}{\sum_{i=1}^{N}(D_i - \overline{D})^2},$$
where the bars represent sample means.
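As a numerical sanity check on the claimed equivalence, here is a small sketch on a hypothetical dataset (the values of `D` and `y` are made up for illustration): the standard OLS formulas should reproduce the subsample mean of $y$ for $D_i=0$ and the difference in subsample means.

```python
import numpy as np

# Hypothetical data: D is a 0/1 dummy, y is an arbitrary outcome.
D = np.array([0, 0, 0, 1, 1, 1, 1], dtype=float)
y = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0, 13.0])

# Standard OLS formulas for the simple regression y = alpha + beta*D + e
beta_hat = np.sum((D - D.mean()) * (y - y.mean())) / np.sum((D - D.mean()) ** 2)
alpha_hat = y.mean() - D.mean() * beta_hat

# Subsample means
ybar0 = y[D == 0].mean()  # mean of y where D = 0
ybar1 = y[D == 1].mean()  # mean of y where D = 1

print(alpha_hat, ybar0)               # both approximately 2.0
print(beta_hat, ybar1 - ybar0)        # both approximately 9.5
```

The check confirms the claim numerically but does not explain it, which is what the question asks for.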
statistics regression least-squares linear-regression
edited Jul 29 at 12:46
asked Jul 29 at 12:37
elbarto
1,519523
1 Answer
Denote by $n_0$ the number of zeros (of $D$) and by $n_1$ the number of ones, so that the total number of observations is $n = n_0 + n_1$. For $\beta$ you can see the full derivation here; compactly, it can be written as
$$
\hat{\beta} = \bar{y}_1 - \bar{y}_0.
$$
For $\alpha$ you can just plug in this result. Using $\bar{y} = \frac{n_1 \bar{y}_1 + n_0 \bar{y}_0}{n_0 + n_1}$ and $\bar{D} = \frac{n_1}{n_0 + n_1}$:
\begin{align}
\hat{\alpha} &= \bar{y} - \bar{D}\hat{\beta} \\
&= \frac{n_1 \bar{y}_1 + n_0 \bar{y}_0}{n_0 + n_1} - \frac{n_1}{n_0 + n_1}\left( \bar{y}_1 - \bar{y}_0 \right) \\
&= \frac{(n_0 + n_1)\,\bar{y}_0}{n_0 + n_1} \\
&= \bar{y}_0.
\end{align}
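The plug-in step above can be sketched numerically; the dataset sizes and values below are hypothetical, chosen only so that $n_0 = 3$ and $n_1 = 4$:

```python
from statistics import mean

# Hypothetical subsamples: three observations with D = 0, four with D = 1.
y_by_D = {0: [1.0, 2.0, 3.0], 1: [10.0, 11.0, 12.0, 13.0]}
n0, n1 = len(y_by_D[0]), len(y_by_D[1])
ybar0, ybar1 = mean(y_by_D[0]), mean(y_by_D[1])

beta_hat = ybar1 - ybar0                       # difference in subsample means
ybar = (n1 * ybar1 + n0 * ybar0) / (n0 + n1)   # pooled mean decomposition
Dbar = n1 / (n0 + n1)                          # share of ones in the sample
alpha_hat = ybar - Dbar * beta_hat             # the plug-in step from the answer

print(abs(alpha_hat - ybar0) < 1e-12)          # True: alpha_hat collapses to ybar0
```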
answered Aug 1 at 21:33
V. Vancak
9,7802926