Visualize $T(n)=2T(n/2)+O(n)=O(n\log n)$ on a Tree

I understand the mathematical proof that $T(n)=2T(n/2)+O(n)=O(n\log n)$, but I cannot visually wrap my head around how it works. Intuitively, it feels like it should be $O(n^2)$. Can you show me why these two trains of thought are wrong:



  1. Imagine a binary tree with $n$ nodes. On each node an $O(n)$ operation needs to be executed. A recursive function for this would thus be $T(n)=O(n)+T(\text{left})+T(\text{right})=2T(n/2)+O(n)$. Since there are $n$ nodes and each operation takes $O(n)$ time, $T(n)=n\cdot O(n)=O(n^2)$.

  2. If you took this binary tree and simplified the operation to be $O(1)$, the complexity for this new tree would be $T(n)/O(n)=O(n\log n)/O(n)=O(\log n)$ (since things are now less complex by a factor of $O(n)$). However, it is mathematically proven that $T(n)=2T(n/2)+O(1)=O(n)$, not $O(\log n)$.

Note: I understand the mathematical proof of why $T(n)=2T(n/2)+O(n)=O(n\log n)$ works. However, I cannot visualize it or grasp it intuitively.
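As a numeric sanity check, here is a minimal Python sketch (not from the original post; it assumes the non-recursive work is exactly $n$, or exactly $1$ for the second thought's variant, with base case $T(1)=U(1)=1$, where $U$ is a hypothetical name for the $O(1)$-work variant) that evaluates both recurrences:

```python
import math

def T(n):
    """Operation count for T(n) = 2*T(n/2) + n, with T(1) = 1 (illustrative base case)."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n   # two half-size subproblems, plus n units of work at this node

def U(n):
    """Operation count for U(n) = 2*U(n/2) + 1: the O(1)-work variant from thought 2."""
    if n <= 1:
        return 1
    return 2 * U(n // 2) + 1   # same tree shape, but constant work per node

for n in (2**k for k in range(4, 14)):
    t, u = T(n), U(n)
    print(f"n={n:5d}  T/(n log n)={t/(n*math.log2(n)):.3f}  T/n^2={t/n**2:.5f}  U/n={u/n:.3f}")
# T/(n log n) settles near 1 while T/n^2 shrinks toward 0, so T(n) grows like n log n.
# U/n settles near 2 (in fact U(n) = 2n - 1), i.e. O(n), not the O(log n) thought 2 predicts.
```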







asked Jul 18 at 2:33 by Dan; edited Jul 18 at 3:09 by zipirovich

1 Answer
I think you're misinterpreting what the master theorem describes. The task here is to process a tree, which is not exactly the same as just processing the nodes. Sorry for the pun, but it's like you're missing the tree in the forest behind the nodes. :-)




> On each node an $O(n)$ operation needs to be executed.




That's wrong — exactly because the real task is to process a tree, not just individual nodes one at a time. That's why the number of operations is NOT the same at different nodes. For example, simply by the definition of the function $T$: the top node requires $T(n)$ operations — because it's the root of a tree of size $n$ that we need to process; but each leaf, being a tree with a single node, requires only $T(1)$ operations, which is a constant.




> Since there are $n$ nodes and each operation takes $O(n)$ time, $T(n)=n\cdot O(n)=O(n^2)$.




Wrong, as explained above — different nodes require different time.
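To make this concrete, here is a small illustrative Python sketch (an added toy example, assuming the non-recursive cost at a node of size $m$ is exactly $m$) that records where the "$+O(n)$" work lands, per depth of the recursion tree:

```python
from collections import defaultdict

def visit(m, depth, work_per_level):
    """Walk the recursion tree of T(m) = 2*T(m/2) + m, recording the +O(m) part."""
    work_per_level[depth] += m          # non-recursive work done at this node
    if m > 1:
        visit(m // 2, depth + 1, work_per_level)
        visit(m - m // 2, depth + 1, work_per_level)

levels = defaultdict(int)
visit(1024, 0, levels)
print(dict(levels))
# The root does 1024 units and each leaf does 1, yet every level sums to 1024;
# with log2(1024) + 1 = 11 levels, the total is 11 * 1024, i.e. Theta(n log n).
```

The per-node cost halves at each level while the node count doubles, so the per-level total stays fixed.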




> If you took this binary tree and simplified the operation to be $O(1)$, …




Same thing.




> … the complexity for this new tree would be $T(n)/O(n)=O(n\log n)/O(n)=O(\log n)$ …




Sorry, but this simply doesn't make any sense. You started this calculation with the assumption that $T(n)=O(n\log n)$, since that's what you replaced $T(n)$ with. Let's even leave aside the question of why you would assume from the beginning the very thing we're trying to deduce (a serious logical error in itself). But if you already know what $T(n)$ is, how much sense does it make to "deduce" something different for it?




What the master theorem describes is the divide-and-conquer approach. Your main misconception is that the same thing happens at each node, but that's not true. In fact, three things happen at each node (for a binary tree):



1. We need to process the left subtree, at a cost of $T(\text{left})$;

2. We need to process the right subtree, at a cost of $T(\text{right})$;

3. And we need some time to perform additional operations at the node, such as forming the subtrees, putting the results from the subtrees together, etc.

The "$+O(n)$" term in the recursive formula in your example refers to the computational cost of the last part only — for a node which is the root of a (sub)tree of size $n$; it is NOT the entire cost of what happens at a node, and it is NOT the same for all nodes. The total number of operations that we have to perform at such a node is by definition denoted $T(n)$, and it is in fact expressed by the recursive relation
$$T(n)=O(n)+T(\text{left})+T(\text{right})=2T(n/2)+O(n).$$
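One way to visualize why this recurrence gives $O(n\log n)$ rather than $O(n^2)$: sum the non-recursive work level by level. Assuming $n$ is a power of $2$ for simplicity, depth $d$ holds $2^d$ nodes, each the root of a subtree of size $n/2^d$ and each contributing about $c\,n/2^d$ of that "$+O(n)$" work, so
$$\sum_{d=0}^{\log_2 n} 2^d \cdot c\,\frac{n}{2^d} \;=\; \sum_{d=0}^{\log_2 n} c\,n \;=\; c\,n\,(\log_2 n+1) \;=\; O(n\log n).$$
Every level costs the same $c\,n$ in total, even though individual nodes at different depths cost very different amounts.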






answered Jul 18 at 3:48 by zipirovich (accepted)
• "You started this calculation with the assumption that $T(n)=O(n\log n)$, since that's what you replaced $T(n)$ with." Sorry if it was not clear, but I was doing a proof by contradiction that $T(n)\neq O(n\log n)$. I started by assuming that "$T(n)\neq O(n\log n)$" is false, so $T(n)=O(n\log n)$, and then derived the contradiction that $O(n)=O(\log n)$. Also, I was not aware of the master theorem and learned the proofs a different way.
– Dan
Jul 18 at 4:15










• @Dan: Okay, I see! Then for a proof by contradiction this could be the starting point. But then, dividing everything by $O(n)$ is an arbitrary, unjustified step: I don't see any logical reason for doing it, and so the "proof" falls apart anyway. As for the master theorem, you can ignore the reference. But still, the logic of divide-and-conquer is what's at work here.
– zipirovich
Jul 18 at 4:36










• Yes, I now see that is where the error lies.
– Dan
Jul 18 at 6:10









