Question concerning linear combinations of vectors and linear independence in Linear Algebra.

My question concerns the definition of linear combinations and a criterion for linear independence of a set (either finite or infinite).



Here are the definition and criterion given:



A vector $v$ in a vector space $V$ is a linear combination of vectors of a set $S$ if there is a finite number of vectors $x_1, \cdots, x_n$ in $S$ and scalars $a_1, \cdots, a_n$ such that
$$v = a_1x_1 + \cdots + a_nx_n$$
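For concreteness, here is a minimal numeric sketch of this definition (made-up vectors and scalars, assuming coordinate vectors in $\mathbb{R}^2$ represented as numpy arrays):

```python
import numpy as np

# Hypothetical vectors taken from a set S in R^2, and scalars a_1, a_2.
x1 = np.array([1.0, 0.0])
x2 = np.array([1.0, 1.0])
a1, a2 = 2.0, 3.0

# v is a linear combination of x1 and x2 with coefficients a1, a2.
v = a1 * x1 + a2 * x2
print(v)  # [5. 3.]
```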



Question:



Here, is "finite" restricted to at least $1$ vector? Or does this definition include the possibility of a vector $v$ being a linear combination of no vectors at all?



Also, can these finitely many vectors from the set $S$ be the same, or must they all be distinct?



Criterion for Linear Independence:



My book writes the following fact:



A set is linearly independent if and only if the only representations of the zero vector as linear combinations of its vectors are trivial representations.
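For finitely many coordinate vectors this criterion can be checked mechanically: stack the vectors as columns of a matrix and test whether the homogeneous system has only the trivial solution, which is equivalent to full column rank. A sketch of mine, assuming numpy:

```python
import numpy as np

def is_linearly_independent(vectors):
    # Independent iff the only combination giving 0 is trivial,
    # i.e. the matrix with these vectors as columns has full column rank.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

print(is_linearly_independent([np.array([1, 0, 0]),
                               np.array([0, 1, 0]),
                               np.array([0, 0, 1])]))   # True
print(is_linearly_independent([np.array([1, 2, 3]),
                               np.array([2, 4, 6])]))   # False: 2nd = 2 * 1st
```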



Question:



Bearing in mind that the definition of linear dependence requires a nontrivial representation of the zero vector as a linear combination of DISTINCT vectors of the set examined for linear dependence, why does this criterion for linear independence say "as linear combinations of its vectors" as opposed to "as linear combinations of its distinct vectors"?



Lastly, if a set is linearly independent, does that imply the set contains distinct vectors?



Does it make sense to talk about linear dependence and linear independence in the context of a collection of vectors with some vectors repeated, that is, a collection whose elements are not all distinct?



Thanks in advance.

asked Jul 22 at 4:17 by Gabe; edited Jul 22 at 5:01 by Aniruddha Deshmukh

  • Regarding your concern about distinctness: since you're talking about a "set", the vectors in it are distinct by default.
    – Niing
    Jul 22 at 8:34

  • I'm reading the same book; my question may be helpful to you: math.stackexchange.com/q/2675139/390226
    – Niing
    Jul 22 at 8:38
2 Answers

Accepted answer (Aniruddha Deshmukh, answered Jul 22 at 5:06)

Here are answers to your questions:



  1. Firstly, when you say scalars $a_1, a_2, \cdots, a_n$, they are real numbers and hence can also be $0$. Keeping this in mind, suppose there is a set $S = \left\lbrace v_1, v_2, \cdots, v_n \right\rbrace \subseteq V$. Then, quite obviously, the vector $\textbf{0} \in V$ can be written as
    $$0 \cdot v_1 + 0 \cdot v_2 + \cdots + 0 \cdot v_n = \textbf{0}$$
    This is what we call the "trivial linear combination".

In fact, the confusion you have in mind is that when you say that a vector is a linear combination of other vectors, there must be at least one vector and one scalar with which you can construct your "linear combination".



  2. When you talk about a "set", elements cannot be repeated. So, there is no point in asking if the elements of the set are distinct.

Lastly, I do not know what book you are following, but I feel that a better version of the definitions of linear dependence and independence is the following:



Linear Independence



A finite set $S = \left\lbrace v_1, v_2, \cdots, v_n \right\rbrace \subseteq V$ is said to be linearly independent iff



$$\alpha_1 \cdot v_1 + \alpha_2 \cdot v_2 + \cdots + \alpha_n \cdot v_n = \textbf{0}$$



implies that $\alpha_1 = \alpha_2 = \cdots = \alpha_n = 0$. This actually means that the only way you can obtain the zero vector $\textbf{0}$ from a linearly "independent" set is by setting all the scalars (coefficients) to $0$, which we call the "trivial" combination.
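As an illustration of "only the trivial combination" (my own numeric sketch, not from the book): for two vectors in $\mathbb{R}^2$ whose matrix is invertible, solving $\alpha_1 v_1 + \alpha_2 v_2 = \textbf{0}$ forces both coefficients to $0$:

```python
import numpy as np

# Hypothetical independent vectors v1, v2 as the columns of A.
A = np.column_stack([np.array([1.0, 2.0]), np.array([3.0, 4.0])])

# det(A) != 0, so the homogeneous system A @ alpha = 0 has only
# the trivial solution alpha = (0, 0).
print(np.linalg.det(A))                 # -2.0 (nonzero, up to rounding)
print(np.linalg.solve(A, np.zeros(2)))  # [0. 0.] -- the trivial combination
```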



In the case of an infinite set $S \subseteq V$, it is said to be linearly independent iff every finite subset of $S$ is linearly independent. We already have the definition of linear independence for finite sets, which can then be used.



Linear Dependence



A finite set $S = \left\lbrace v_1, v_2, \cdots, v_n \right\rbrace \subseteq V$ is said to be linearly "dependent" iff it is not linearly independent. Thus, we need to negate the statement for linear independence. The negation is the statement:



"$exists alpha_1, alpha_2, cdots, alpha_n in mathbbR$ and $i in leftlbrace 1, 2, cdots, n rightrbrace$ such that $alpha_1 cdot v_1 + alpha_2 cdot v_2 + cdots + alpha_n cdot v_n = textbf0$ and $alpha_i neq 0$"



This statement means that the vector $v_i \in S$ can actually be written as a linear combination of the other vectors. In particular,



$$v_i = \left( -\dfrac{\alpha_1}{\alpha_i} \right) \cdot v_1 + \left( -\dfrac{\alpha_2}{\alpha_i} \right) \cdot v_2 + \cdots + \left( -\dfrac{\alpha_{i-1}}{\alpha_i} \right) \cdot v_{i-1} + \left( -\dfrac{\alpha_{i+1}}{\alpha_i} \right) \cdot v_{i+1} + \cdots + \left( -\dfrac{\alpha_n}{\alpha_i} \right) \cdot v_n$$



and therefore the vector $v_i \in S$ is "dependent" on the other vectors.
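Numerically, the formula above can be checked with made-up numbers (a sketch of mine, assuming numpy): given a nontrivial relation $2v_1 + v_2 - v_3 = \textbf{0}$, dividing through by the nonzero coefficient recovers $v_3$ from the others.

```python
import numpy as np

v1 = np.array([1.0, 1.0])
v2 = np.array([2.0, 0.0])
v3 = 2 * v1 + v2          # so 2*v1 + 1*v2 + (-1)*v3 = 0, a nontrivial relation

alpha = [2.0, 1.0, -1.0]  # alpha_3 = -1 is the nonzero coefficient we divide by
recovered = (-alpha[0] / alpha[2]) * v1 + (-alpha[1] / alpha[2]) * v2
print(np.allclose(recovered, v3))  # True
```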



In fact, such a linear combination $\alpha_1 \cdot v_1 + \alpha_2 \cdot v_2 + \cdots + \alpha_n \cdot v_n = \textbf{0}$ with some $\alpha_i \neq 0$ is called a "non-trivial" linear combination.



For an infinite set $S \subseteq V$, it is said to be linearly dependent iff it is not linearly independent. Again, we need to negate the statement for linear independence of an infinite set. The negation of the statement would be:



"There exists a finite set $A subset S$ such that $A$ is not linearly independent". And now, we do have the definition of linear dependence (not linear independence) for finite sets which can be used.



I hope this clears up your confusion about distinct elements. If you are still confused, try forming sets in $\mathbb{R}^2$ and $\mathbb{R}^3$ which are linearly dependent and independent, since these you can easily visualize. Also read some material on the span of a set and how linear combinations and span connect with linear dependence and independence.
  • Thanks for the insights. I am using Linear Algebra by Friedberg. It defines linear dependence of a set as a nontrivial linear combination of distinct vectors in the set, then defines linear independence as the negation of it. However, I'll stick to your given definition as it makes more sense this way.
    – Gabe
    Jul 22 at 5:56

Answer (Marc van Leeuwen, answered Jul 22 at 4:50, edited Jul 22 at 7:17)

The empty set is a finite set by all standards, so this definition also allows for taking $n=0$. That choice requires knowing the value of a linear combination of no vectors at all, and that value is the zero vector. As a consequence, the zero vector is always a linear combination of whatever set $S$ of vectors you specify. Using $n=0$ is essential when $S$ is itself the empty set, in which case the zero vector is the only linear combination one can form.
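This matches the usual empty-sum convention; a small sketch of mine for the $n=0$ case, assuming vectors in $\mathbb{R}^3$ as numpy arrays:

```python
import numpy as np

zero = np.zeros(3)
terms = []  # n = 0: no vectors chosen at all

# Summing no (scalar, vector) terms, starting from the zero vector,
# yields the zero vector: the value of the empty linear combination.
value = sum((a * x for a, x in terms), zero)
print(value)  # [0. 0. 0.]
```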



The formulation is not very clear about whether one could select the same vector more than once when forming a linear combination (it depends on whether one reads "a finite number of vectors" as a finite set or as a finite sequence). However, it does not matter, since allowing such repetition does not allow any more linear combinations to be formed (one can use $ax+bx=(a+b)x$ to reduce the number of occurrences of the same vector $x$ until no more repetitions occur), and in practice it is most convenient not to forbid it. For instance, this allows seeing without any complications that the sum of two linear combinations from the set $S$ is again a linear combination from the set $S$.
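The reduction $ax+bx=(a+b)x$ amounts to accumulating coefficients per distinct vector; a sketch of mine that collapses repeats, representing vectors as hashable tuples:

```python
from collections import defaultdict

# A combination in R^2 that repeats the vector (1, 0):
# 1*(1,0) + (-3)*(1,0) + 2*(0,1).
terms = [(1.0, (1, 0)), (-3.0, (1, 0)), (2.0, (0, 1))]

coeffs = defaultdict(float)
for a, x in terms:
    coeffs[x] += a  # a*x + b*x = (a + b)*x

print(dict(coeffs))  # {(1, 0): -2.0, (0, 1): 2.0}: each vector now occurs once
```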



The definition of linear independence uses a slightly different view of linear combinations, as it is not so much about which vectors are or are not linear combinations of elements of $S$, but about specific linear combinations (which may be trivial combinations or not). To this end, a specific linear combination of elements of $S$ is determined by specifying for each element of $S$ a corresponding coefficient, where the association must assign a nonzero scalar to only finitely many elements of $S$. The value of such a linear combination is found by forming $a_1x_1+\cdots+a_nx_n$, where the sequence of vectors $x_1,\ldots,x_n$ contains each vector of $S$ with a nonzero associated scalar exactly once, and each $a_i$ is the scalar associated to $x_i$. A specific linear combination is trivial if the scalars associated to the vectors are all $0$. The set $S$ is defined to be linearly independent if the only specific linear combination of elements of $S$ whose value is the zero vector is the trivial linear combination.
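One way to picture a "specific linear combination" as described here is as a finitely supported coefficient map; a sketch of mine using a dict from vectors (as tuples) to scalars:

```python
import numpy as np

# One scalar per element of S; only finitely many may be nonzero.
combo = {(1, 0, 0): 2.0, (0, 1, 0): 0.0, (0, 0, 1): -1.0}

def value(combo, dim=3):
    # Sum a*x over the vectors whose associated scalar is nonzero.
    total = np.zeros(dim)
    for x, a in combo.items():
        if a != 0:
            total += a * np.array(x, dtype=float)
    return total

print(value(combo))                         # [ 2.  0. -1.]
print(all(a == 0 for a in combo.values()))  # False: not the trivial combination
```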



Note that this formulation arranges for the same vector to be used only once in a specific linear combination. It must do so, or else a nonzero set could never be linearly independent, because one could take any vector $x$ from the set and form, for instance, $1x+(-3)x+2x$ as a "nontrivial linear combination" with value the zero vector. Talking, as you suggest, about "linear combinations of its distinct vectors" might convey the right idea, but it does not make any precise sense (there is no definition of what such a phrase means).



Your final question "if a set is linearly independent, does that imply the set contains distinct vectors" is pointless, since elements of a set are always distinct: a set cannot contain a given value more than once as an element. Your question would make sense if instead it were talking about sequences or families of vectors (since these may have repeated occurrences of the same values). The answer then is that when there is at least one repetition, the sequence/family is never independent, as the difference between two instances of the same vector would be a nontrivial specific linear combination whose value is the zero vector. By the same token, any set/sequence/family containing the zero vector cannot be linearly independent (taking that vector with scalar $1$ would be a nontrivial linear combination with zero value).