Negation of the Definition of Linear Independence

$\textbf{Definition.}$

Let $V$ be a vector space, and let $\textbf{v}_1, \dots, \textbf{v}_n \in V$. Let $\alpha_1, \dots, \alpha_n$ be scalars. Let $\textbf{0}$ be the zero element of $V$.

$\textbf{v}_1, \dots, \textbf{v}_n$ are said to be linearly independent if $$\alpha_1 \textbf{v}_1 + \dots + \alpha_n \textbf{v}_n = \textbf{0} \Leftrightarrow \alpha_1 = \dots = \alpha_n = 0.$$

Firstly, is this a valid definition of linear independence?

Secondly, how do I find the negation of this definition of linear independence? I would expect to get something like: there exist scalars $\alpha_1, \dots, \alpha_n$, not all zero, such that $\alpha_1 \textbf{v}_1 + \dots + \alpha_n \textbf{v}_n = \textbf{0}$; but I am not sure how I would arrive at something like this.







  • You are right about the negation.
    – xbh, Jul 31 at 5:19

  • @xbh Yes, I suspected so, but I would like to know how to negate the definition (i.e. go through some explicit steps) to arrive at something equivalent to what I wrote at the end.
    – Leekboi, Jul 31 at 5:21

  • Reverse the quantifiers, i.e. "for all" becomes "there exists" and "there exists" becomes "for all", then negate the inner statement.
    – xbh, Jul 31 at 5:23

  • Example: the definition of linear independence can be rewritten as "if $\sum \alpha_j \boldsymbol{v}_j = \mathbf{0}$, then all $\alpha_j = 0$" [the other direction always holds in a vector space]. Now negate the conclusion: "all" becomes "some", and "$\alpha_j = 0$" becomes "$\alpha_j \neq 0$". Combining these yields "some $\alpha_j$ can be nonzero even though the equation holds".
    – xbh, Jul 31 at 5:30

  • @J.G. Thanks for pointing that out. I have noticed these flaws.
    – xbh, Jul 31 at 5:34














asked Jul 31 at 5:14 by Leekboi







1 Answer
Your definition of linear independence is valid, although we usually write only $\implies$, since the reverse implication is trivial. Of course, to say that the equation forces all the $\alpha_i$ to be zero is equivalent to saying that there does not exist any other choice of the $\alpha_i$ that works. Therefore the negation, as expected, says that one does.
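To spell out the explicit steps the question asks for, here is one possible predicate-calculus derivation (added for illustration; not part of the original answer):

```latex
\begin{align*}
&\neg\,\forall \alpha_1,\dots,\alpha_n\;
  \bigl(\alpha_1\mathbf{v}_1+\dots+\alpha_n\mathbf{v}_n=\mathbf{0}
  \;\Rightarrow\; \alpha_1=\dots=\alpha_n=0\bigr)\\
\iff{}& \exists\, \alpha_1,\dots,\alpha_n\;
  \neg\bigl(\alpha_1\mathbf{v}_1+\dots+\alpha_n\mathbf{v}_n=\mathbf{0}
  \;\Rightarrow\; \alpha_1=\dots=\alpha_n=0\bigr)
  && (\neg\forall\,P \iff \exists\,\neg P)\\
\iff{}& \exists\, \alpha_1,\dots,\alpha_n\;
  \bigl(\alpha_1\mathbf{v}_1+\dots+\alpha_n\mathbf{v}_n=\mathbf{0}
  \;\wedge\; \neg(\alpha_1=\dots=\alpha_n=0)\bigr)
  && (\neg(P\Rightarrow Q) \iff P\wedge\neg Q)
\end{align*}
```

Since $\neg(\alpha_1=\dots=\alpha_n=0)$ says the $\alpha_i$ are not all zero, the last line is exactly the expected statement: there exist scalars, not all zero, whose combination is the zero vector.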






  • I guess this is pretty obvious, thank you. I was hoping to see for myself that what you have said follows from explicit negation of the definition using predicate calculus, but I have now convinced myself of this by writing it out on paper.
    – Leekboi, Jul 31 at 5:32
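The negated statement can also be checked on concrete vectors (a hypothetical sketch, not from the original thread; the helper `combine` is invented for this illustration):

```python
# Illustrating the negated definition: v_1, ..., v_n are linearly
# DEPENDENT iff there exist scalars alpha_1, ..., alpha_n, not all
# zero, with alpha_1 v_1 + ... + alpha_n v_n = 0.

def combine(alphas, vectors):
    """Componentwise linear combination sum_i alpha_i * v_i."""
    dim = len(vectors[0])
    return tuple(sum(a * v[k] for a, v in zip(alphas, vectors))
                 for k in range(dim))

# v2 = 2 * v1, so the pair is dependent: (alpha_1, alpha_2) = (2, -1)
# is a nonzero witness for the existential statement in the negation.
v1, v2 = (1.0, 2.0), (2.0, 4.0)
assert combine((2.0, -1.0), (v1, v2)) == (0.0, 0.0)

# The standard basis pair is independent: on a small grid of scalar
# choices, no nonzero pair annihilates it.
e1, e2 = (1.0, 0.0), (0.0, 1.0)
grid = (-1.0, 0.0, 1.0)
witnesses = [(a, b) for a in grid for b in grid
             if (a, b) != (0.0, 0.0)
             and combine((a, b), (e1, e2)) == (0.0, 0.0)]
assert witnesses == []
```

The grid check is of course only a finite sanity check, not a proof; it illustrates how the existential witness in the negation distinguishes dependent from independent families.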











answered Jul 31 at 5:30 by J.G.






