elementary proof that dual variables are derivatives of optimal LP solution w.r.t. RHS

I'm wondering how I can minimally prove that the dual variables of a linear program (LP) are the derivatives of its optimal value with respect to the RHS constants $b$.



$$ \text{maximize} \quad c^T x $$
$$ \text{s.t.} \quad Ax \le b, \; x \ge 0. $$
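
Writing $y$ for the dual variables, the dual of this LP is

$$ \text{minimize} \quad b^T y $$
$$ \text{s.t.} \quad A^T y \ge c, \; y \ge 0, $$

and the claim in question is that an optimal $y^*$ gives the derivatives of the optimal primal value with respect to the entries of $b$.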



My understanding of the dual variables comes from the Wikipedia article on LP duality, and is rather limited to the natural-language interpretation given there.



I saw it mentioned here that




The interpretation of the dual variables as derivatives of the optimal
value of the objective function with respect to the elements of the
right-hand-side is well known in mathematical programming. This result
can be extended to ...




How can this be proved in an elementary way (with a minimal number of citations of other theorems)?



I understand the form of the dual problem and the formal definition of the dual variables. But I have trouble envisioning the derivative of the optimal objective value, since the optimal value is not a closed-form linear function of $b$ (it is the optimum of a linear function subject to a set of linear inequalities).



Is it possible to convert the optimal solution into a closed-form function of the RHS, and then take the derivative using linear algebra? (I guess not.)
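
As a numerical sanity check (a sketch only; the instance below is made up for illustration and assumes SciPy is installed), one can solve the primal and the dual with `scipy.optimize.linprog` and compare each dual variable with a finite-difference estimate of $\partial z^*/\partial b_i$:

```python
import numpy as np
from scipy.optimize import linprog

# Made-up instance of  max c^T x  s.t.  A x <= b, x >= 0.
c = np.array([3.0, 2.0])
A = np.array([[1.0, 1.0],
              [2.0, 1.0]])
b = np.array([4.0, 6.0])

def primal_value(rhs):
    # linprog minimizes, so negate the objective and the reported optimum.
    res = linprog(-c, A_ub=A, b_ub=rhs, bounds=[(0, None)] * len(c))
    return -res.fun

# Dual:  min b^T y  s.t.  A^T y >= c, y >= 0,
# rewritten for linprog as  -A^T y <= -c.
dual = linprog(b, A_ub=-A.T, b_ub=-c, bounds=[(0, None)] * len(b))
y = dual.x

# Finite-difference estimate of d z*/d b_i versus the dual variable y_i.
eps = 1e-4
z0 = primal_value(b)
for i in range(len(b)):
    db = np.zeros_like(b)
    db[i] = eps
    print(f"b[{i}]: finite diff = {(primal_value(b + db) - z0) / eps:.4f}, "
          f"dual variable = {y[i]:.4f}")
```

On this instance both constraints are active at the unique optimum, the basis does not change under a small perturbation, and both columns should print approximately $1.0$; when the optimal basis is degenerate or changes, the derivative need not exist and only one-sided sensitivities are available (the caveat raised in the comments below).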







asked Jul 23 at 13:34, edited Jul 23 at 14:05
– tinlyx

  • Could you give your definition of 'reduced price'? The Lagrangian is a linear function of the dual variables btw.
    – LinAlg
    Jul 23 at 13:55










  • @LinAlg Thanks for the clarification. I may have some misconceptions about reduced price/cost; I just changed the mention of "reduced cost" to "dual variables" to keep the question minimal and consistent with the quote.
    – tinlyx
    Jul 23 at 14:06











  • the statement is true only if the basis does not change; a proof follows trivially from revised simplex
    – LinAlg
    Jul 23 at 14:37
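
To spell out the revised-simplex argument mentioned in the comment above (a sketch, assuming the LP is put in standard form with slack variables and the optimal basis $B$ is nondegenerate, so it stays optimal for small changes in $b$): the optimal basic solution and optimal value are

$$ x_B = B^{-1} b, \qquad z^*(b) = c_B^T B^{-1} b, $$

where $c_B$ collects the objective coefficients of the basic variables, so

$$ \frac{\partial z^*}{\partial b} = (B^{-1})^T c_B = y^*, $$

which is exactly the optimal dual vector $y^{*T} = c_B^T B^{-1}$ produced by the (revised) simplex method.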
























