Negation of the Definition of Linear Independence
$\textbf{Definition.}$
Let $V$ be a vector space, and let $\mathbf{v}_1,\dots,\mathbf{v}_n \in V$. Let $\alpha_1,\dots,\alpha_n$ be scalars. Let $\mathbf{0}$ be the zero element of $V$.
$\mathbf{v}_1,\dots,\mathbf{v}_n$ are said to be linearly independent if $$\alpha_1 \mathbf{v}_1 + \dots + \alpha_n \mathbf{v}_n = \mathbf{0} \Leftrightarrow \alpha_1 = \dots = \alpha_n = 0.$$
Firstly, is this a valid definition of linear independence?
Secondly, how do I find the negation of this definition of linear independence? I would expect to get something like "there exist scalars $\alpha_1,\dots,\alpha_n$, not all zero, such that $\alpha_1 \mathbf{v}_1 + \dots + \alpha_n \mathbf{v}_n = \mathbf{0}$", but I am not sure how I would arrive at something like this.
linear-algebra propositional-calculus
– xbh (Jul 31 at 5:19): You are right about the negation.
– Leekboi (Jul 31 at 5:21): @xbh Yes, I suspected so, but I would like to know how (i.e. go through some explicit steps) to negate the definition and arrive at something equivalent to what I wrote at the end.
– xbh (Jul 31 at 5:23): Reverse the quantifiers, i.e. "for all" becomes "there exists" and "there exists" becomes "for all". Also negate the inner statement.
– xbh (Jul 31 at 5:30): Example. The definition of linear independence can be rewritten as "if $\sum \alpha_j \boldsymbol{v}_j = \mathbf{0}$, then all $\alpha_j = 0$" [the other direction always holds in a vector space]. Now negate the conclusion: "all" becomes "some", and "$\alpha_j = 0$" becomes "$\alpha_j \neq 0$". Combining these yields "some $\alpha_j$ can be nonzero even though the equation holds".
– xbh (Jul 31 at 5:34): @J.G. Thanks for pointing that out. I have noticed these flaws.
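The explicit steps the asker is looking for can be sketched in predicate calculus. Below, $P$ abbreviates "$\sum_i \alpha_i \mathbf{v}_i = \mathbf{0}$" and $Q$ abbreviates "$\alpha_1 = 0 \wedge \dots \wedge \alpha_n = 0$"; the quantifiers range over the scalar field $F$ (this notation is mine, not from the original thread):

```latex
\begin{align*}
&\neg\,\forall \alpha_1,\dots,\alpha_n \in F\; (P \implies Q) \\
&\iff \exists\, \alpha_1,\dots,\alpha_n \in F\; \neg(P \implies Q)
  && \text{push $\neg$ through $\forall$} \\
&\iff \exists\, \alpha_1,\dots,\alpha_n \in F\; (P \wedge \neg Q)
  && \neg(P \implies Q) \equiv P \wedge \neg Q \\
&\iff \exists\, \alpha_1,\dots,\alpha_n \in F\;
  \Bigl( \textstyle\sum_i \alpha_i \mathbf{v}_i = \mathbf{0}
  \;\wedge\; \exists i\; \alpha_i \neq 0 \Bigr)
  && \text{De Morgan on } \neg Q
\end{align*}
```

The last line is exactly "there exist scalars $\alpha_1,\dots,\alpha_n$, not all zero, such that $\alpha_1 \mathbf{v}_1 + \dots + \alpha_n \mathbf{v}_n = \mathbf{0}$".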
asked Jul 31 at 5:14 by Leekboi
1 Answer (accepted)
Your definition of linear independence is valid, although we usually write only $\implies$, since the reverse implication is trivial. To say that the equation forces all the $\alpha_i$ to be zero is equivalent to saying that there does not exist any other choice of the $\alpha_i$ that works. Therefore the negation, as you expected, says that one does.
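As a concrete illustration (not part of the original answer), the negated definition can be tested numerically for vectors in $\mathbb{R}^m$: the vectors are linearly dependent exactly when the matrix with those vectors as columns has a nontrivial null space. A minimal sketch using NumPy's SVD (the helper name `dependence_witness` is my own):

```python
import numpy as np

def dependence_witness(vectors, tol=1e-10):
    """Return coefficients alpha (not all zero) with sum(alpha_i * v_i) = 0,
    or None if the vectors are linearly independent (up to tol)."""
    A = np.column_stack(vectors)  # the v_i as columns of A
    # Rows of vt whose singular value is (near) zero span the null space of A.
    _, s, vt = np.linalg.svd(A)
    # Pad s with zeros so it lines up with the rows of vt
    # when there are more vectors than the ambient dimension.
    s = np.concatenate([s, np.zeros(vt.shape[0] - len(s))])
    for sigma, row in zip(s, vt):
        if sigma < tol:
            return row  # a nontrivial null-space vector: not all entries are zero
    return None

# (1,0), (0,1), (1,1) are dependent: v1 + v2 - v3 = 0,
# so a witness (some alpha_i nonzero) exists.
w = dependence_witness([np.array([1.0, 0.0]),
                        np.array([0.0, 1.0]),
                        np.array([1.0, 1.0])])
```

Here `w` is a nonzero coefficient vector satisfying the negated definition, while for an independent set such as $\{(1,0), (0,1)\}$ the function returns `None`.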
– Leekboi (Jul 31 at 5:32): I guess this is pretty obvious, thank you. I was hoping to see for myself that what you said follows from an explicit negation of the definition using predicate calculus, but I have now convinced myself of this by writing it out on paper.
answered Jul 31 at 5:30 by J.G.