Quadratic forms, change of variables

If one has a symmetric matrix $A$, one can diagonalize it with an orthonormal change of basis, i.e. $S^TAS$ is diagonal for some orthogonal $S$.
Now let's consider the following matrix
$$A=\begin{bmatrix}
1&1\\
1&2
\end{bmatrix}.$$
This matrix corresponds to the quadratic form
$$x_1^2+2x_1x_2+2x_2^2=(x_1+x_2)^2+x_2^2.$$



To me it looks like there has to be a way of determining some $S$ as above (without having to diagonalize, just by completing the square), namely via the change of variables $x_1\leadsto x_1+x_2,\ x_2\leadsto x_2$. This would be done by the matrix
$$S=\begin{bmatrix}
1&1\\
0&1
\end{bmatrix},$$
but this doesn't work out for me...
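For concreteness, a direct check of this proposed $S$ (a computation added for illustration) gives
$$S^TAS=\begin{bmatrix}
1&0\\
1&1
\end{bmatrix}\begin{bmatrix}
1&1\\
1&2
\end{bmatrix}\begin{bmatrix}
1&1\\
0&1
\end{bmatrix}=\begin{bmatrix}
1&2\\
2&5
\end{bmatrix},$$
which is not diagonal.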



Any help will be gratefully appreciated.



Edit:
Let me reformulate my question. By Sylvester's law of inertia, for every symmetric matrix $A$ there exists some basis such that $S^TAS$ is diagonal with only $1$, $-1$ and $0$ on the diagonal, where $S$ is a (not necessarily orthogonal) invertible matrix. I want to determine $S$ without having to calculate all the eigenvalues and diagonalize $A$, because the eigenvalues do not even appear in the form I want.







asked 21 hours ago by Luke Mathwalker (edited 19 hours ago)

  • Generally, completing the square need not produce the same result as orthogonal diagonalization. Maybe you could try completing the square in a different way.
    – xbh
    21 hours ago










  • Why not? Isn't there any way to find $S$ by this procedure?
    – Luke Mathwalker
    21 hours ago










  • When completing the square you stay in the field containing the coefficients, since you only need to add, multiply, square, and occasionally divide by $2$. On the other hand, in some cases (and your example is one of them) the eigenvalues don't belong to that field. In your example the coefficients are all rational, while the eigenvalues are not. An eigenvalue-based algorithm therefore needs some step that takes you out of that field.
    – spiralstotheleft
    20 hours ago










  • OK, I see. I don't want the eigenvalues, just some diagonal matrix: I'm interested in the signature of the form and in a change of basis such that $S^TAS$ is in the "signature form" with $1$, $-1$ and $0$ on the diagonal; the squares we divide out can be absorbed into $S$, so we only leave the field in that last step. I still believe this should be possible without computing all the eigenvalues.
    – Luke Mathwalker
    20 hours ago











  • Your $S$ isn't orthogonal. The special thing about symmetric matrices is that they are orthogonally (or unitarily) diagonalizable (that's why you see $S^TAS$, not $S^{-1}AS$).
    – Arthur
    20 hours ago
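As a concrete check of the field-of-coefficients point above (a computation added for illustration): the characteristic polynomial of $A$ is $\lambda^2-3\lambda+1$, so its eigenvalues are $\tfrac{3\pm\sqrt{5}}{2}$, which are irrational, whereas completing the square never leaves $\mathbb{Q}$.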















1 Answer
Given a symmetric matrix with integer or rational entries, we can construct $P^THP = D$ with rational $P$ and $\det P = \pm 1.$ If it is imperative to have the diagonal entries restricted to $\pm 1, 0,$ we may take the final $D$ and construct a further diagonal matrix $F$ whose entries are the reciprocals of some square roots, to get $FDF$ of the form you want. In this particular problem there is no need.



Algorithm discussed at http://math.stackexchange.com/questions/1388421/reference-for-linear-algebra-books-that-teach-reverse-hermite-method-for-symmetr
https://en.wikipedia.org/wiki/Sylvester%27s_law_of_inertia

$$ H = \left( \begin{array}{rr} 1 & 1 \\ 1 & 2 \end{array} \right) $$
$$ D_0 = H $$
$$ E_j^T D_{j-1} E_j = D_j $$
$$ P_{j-1} E_j = P_j $$
$$ E_j^{-1} Q_{j-1} = Q_j $$
$$ P_j Q_j = Q_j P_j = I $$
$$ P_j^T H P_j = D_j $$
$$ Q_j^T D_j Q_j = H $$



$$ H = \left( \begin{array}{rr} 1 & 1 \\ 1 & 2 \end{array} \right) $$



==============================================



$$ E_1 = \left( \begin{array}{rr} 1 & -1 \\ 0 & 1 \end{array} \right) $$
$$ P_1 = \left( \begin{array}{rr} 1 & -1 \\ 0 & 1 \end{array} \right), \;\;\; Q_1 = \left( \begin{array}{rr} 1 & 1 \\ 0 & 1 \end{array} \right), \;\;\; D_1 = \left( \begin{array}{rr} 1 & 0 \\ 0 & 1 \end{array} \right) $$



==============================================



$$ P^T H P = D $$
$$ \left( \begin{array}{rr} 1 & 0 \\ -1 & 1 \end{array} \right)
\left( \begin{array}{rr} 1 & 1 \\ 1 & 2 \end{array} \right)
\left( \begin{array}{rr} 1 & -1 \\ 0 & 1 \end{array} \right)
= \left( \begin{array}{rr} 1 & 0 \\ 0 & 1 \end{array} \right) $$
$$ Q^T D Q = H $$
$$ \left( \begin{array}{rr} 1 & 0 \\ 1 & 1 \end{array} \right)
\left( \begin{array}{rr} 1 & 0 \\ 0 & 1 \end{array} \right)
\left( \begin{array}{rr} 1 & 1 \\ 0 & 1 \end{array} \right)
= \left( \begin{array}{rr} 1 & 1 \\ 1 & 2 \end{array} \right) $$






answered 18 hours ago by Will Jagy
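
Not part of the answer above, but for readers who want to experiment: the following is a minimal Python/NumPy sketch of the iterative congruence reduction $E_j^T D_{j-1} E_j = D_j$ just described. The function name and the treatment of zero pivots are assumptions of this sketch, not something prescribed by the answer.

import numpy as np

def congruence_diagonalize(A):
    """Reduce a symmetric matrix A to diagonal D = P^T A P by simultaneous
    row/column eliminations (congruences), without computing eigenvalues.
    Sketch only: it assumes every pivot D[k, k] encountered is nonzero,
    so no pivot-exchange step is included."""
    D = np.array(A, dtype=float)
    n = D.shape[0]
    P = np.eye(n)
    for k in range(n):
        if D[k, k] == 0:
            continue  # a full implementation would swap in a usable pivot here
        for i in range(k + 1, n):
            E = np.eye(n)
            E[k, i] = -D[i, k] / D[k, k]   # elementary congruence step E_j
            D = E.T @ D @ E                # E^T D E zeroes the (i, k) and (k, i) entries
            P = P @ E                      # accumulate P = E_1 E_2 ...
    return P, D

A = np.array([[1.0, 1.0],
              [1.0, 2.0]])
P, D = congruence_diagonalize(A)
print(P)            # [[ 1. -1.], [ 0.  1.]]
print(D)            # identity matrix
print(P.T @ A @ P)  # equals D

Running this on the matrix from the question reproduces the $P$ and $D$ shown above; here $D$ is the identity, so the signature is $(2,0)$ and no further rescaling by $F$ is needed.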