Spanning Vectors - proof [duplicate]
This question already has an answer here: Basis for a subspace Proof (3 answers)
Prove that: If v and w span V, then v+w and v-w also span V.
My thinking is that the two vectors span $V$ because they are linear combinations of $v$ and $w$, but I am unsure how to properly go about the proof.
linear-algebra
marked as duplicate by Arthur, Michael Hoppe, John Ma, Lord Shark the Unknown, Simply Beautiful Art Aug 1 at 14:52
This question has been asked before and already has an answer. If those answers do not fully address your question, please ask a new question.
math.stackexchange.com/questions/2866751/…
– Fred
Jul 30 at 9:22
In general, linear combinations of spanning vectors won't span the vector space. For example, $e_1=\begin{pmatrix}1\\0\end{pmatrix}$ and $e_2=\begin{pmatrix}0\\1\end{pmatrix}$ span $\mathbb{R}^2$, but $e_1 + e_2$ and $2e_1 + 2e_2$ only span a $1$-dimensional subspace.
– Babelfish
Jul 30 at 9:32
asked Jul 30 at 9:21
J-Dorman
2 Answers
Consider any vector in the span,
$$u = av + bw.$$
We need coefficients $c, d$ with
$$u = c(v+w) + d(v-w) = (c+d)v + (c-d)w,$$
so choose
- $c+d=a$
- $c-d=b$
that is
- $c=\frac{a+b}{2}$
- $d=\frac{a-b}{2}$
Then
$$u = av + bw = \frac{a+b}{2}(v+w) + \frac{a-b}{2}(v-w),$$
and therefore $v+w$ and $v-w$ also span $V$.
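The coefficient formulas above can be sanity-checked numerically. This is an illustrative sketch in $\mathbb{R}^2$ (the vectors and scalars below are arbitrary choices, not from the question), not a substitute for the proof:

```python
# Check that u = a*v + b*w equals c*(v+w) + d*(v-w)
# with c = (a+b)/2 and d = (a-b)/2, for sample values in R^2.

def lin(s, x, t, y):
    """Return the linear combination s*x + t*y of 2D vectors (tuples)."""
    return (s * x[0] + t * y[0], s * x[1] + t * y[1])

v, w = (1.0, 2.0), (3.0, -1.0)   # arbitrary spanning pair of R^2
a, b = 4.0, -2.5                 # arbitrary coefficients

u = lin(a, v, b, w)
c, d = (a + b) / 2, (a - b) / 2
u_new = lin(c, lin(1, v, 1, w), d, lin(1, v, -1, w))

# Compare with a tolerance to be safe against floating-point rounding.
assert all(abs(p - q) < 1e-9 for p, q in zip(u, u_new))
print(u, u_new)
```

Running the same check with other values of `v`, `w`, `a`, `b` behaves the same way, since the identity holds for all coefficients.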
answered Jul 30 at 10:08
gimusi
Note that $\langle v+w,\, v-w\rangle \subseteq \langle v,\, w\rangle$, because any linear combination of $v+w$ and $v-w$ is obviously a linear combination of $v$ and $w$.
On the other hand,
$$
v=\frac{1}{2}(v+w)+\frac{1}{2}(v-w)
\qquad
w=\frac{1}{2}(v+w)-\frac{1}{2}(v-w)
$$
For the same reason as before, $\langle v,\, w\rangle \subseteq \langle v+w,\, v-w\rangle$, so the two spans are equal.
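The reverse inclusion rests on the two displayed identities: $v$ and $w$ are themselves combinations of $v+w$ and $v-w$ with coefficients $\pm\frac{1}{2}$. A minimal numerical sketch (with arbitrary sample vectors in $\mathbb{R}^2$, chosen here only for illustration):

```python
# Recover v and w from p = v + w and m = v - w via the half-sum /
# half-difference identities v = (p + m)/2 and w = (p - m)/2.

v, w = (1.0, 2.0), (3.0, -1.0)          # arbitrary sample vectors
p = (v[0] + w[0], v[1] + w[1])          # p = v + w
m = (v[0] - w[0], v[1] - w[1])          # m = v - w

v_rec = (0.5 * (p[0] + m[0]), 0.5 * (p[1] + m[1]))
w_rec = (0.5 * (p[0] - m[0]), 0.5 * (p[1] - m[1]))

assert all(abs(x - y) < 1e-9 for x, y in zip(v_rec, v))
assert all(abs(x - y) < 1e-9 for x, y in zip(w_rec, w))
```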
answered Jul 30 at 11:37


egreg