Why does maximum likelihood fail in this simple case?

Suppose one decides to parametrize the exponential probability density in this unorthodox way:

$$
f(x; \theta) = -\theta\, \mathrm e^{\theta x}, \quad x > 0,
$$

where $\theta \in (-\infty, 0)$.



Then putting the derivative of the log-likelihood with one observation equal to zero gives



$$\frac{-1}{\theta} + x = 0 \iff \frac{1}{x} = \theta$$



This would not be a good estimator of $\theta$, since $1/x > 0$ lies outside the parameter space.



From the second derivative of the log-likelihood, $\frac{1}{\theta^2} > 0$, we see that we are in the presence of a minimum instead of a maximum.



Where did the maximum likelihood procedure go wrong? Was it an error in my calculations? The fact that the parameter space is not compact? (In the ordinary parametrization everything works.)



For existence it should be sufficient that the parameter space is compact and the likelihood function is continuous on the parameter space.
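The maximization above can be checked numerically. The following sketch (with a hypothetical single observation $x = 2$) evaluates the log-likelihood $\log f(x;\theta) = \log(-\theta) + \theta x$ on a grid of admissible negative $\theta$ values; the maximum lands near $\theta = -1/x$, not at the inadmissible $\theta = 1/x$:

```python
import numpy as np

x = 2.0  # a single hypothetical observation

def log_likelihood(theta):
    # log f(x; theta) = log(-theta) + theta * x, valid only for theta < 0
    return np.log(-theta) + theta * x

# Evaluate on a dense grid of admissible (negative) parameter values.
thetas = np.linspace(-5.0, -0.01, 100_000)
theta_hat = thetas[np.argmax(log_likelihood(thetas))]

# The maximizer sits near theta = -1/x = -0.5, inside the parameter space.
print(theta_hat)
```

A grid search sidesteps the calculus entirely, which makes it a useful sanity check on a hand-derived score equation.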







  • If $\theta$ is negative and $\frac{1}{x} = \theta$, then $\frac{1}{x}$ is negative as well. – Peter, Jul 22 at 13:09

  • @Peter but $x > 0$; is the mistake that I found the critical point without accounting for the restriction? How could I fix this? – Monolite, Jul 22 at 13:20

  • With a single observation being $0$, the maximum likelihood will occur when the rate is infinite, i.e. when your $\theta = -\infty$, and this need not be at a zero of the derivative of the log-likelihood. – Henry, Jul 22 at 17:55
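Henry's boundary observation can also be checked numerically. With a hypothetical observation $x = 0$, the log-likelihood reduces to $\log(-\theta)$, which increases without bound as $\theta \to -\infty$, so no interior critical point exists:

```python
import numpy as np

x = 0.0  # a hypothetical single observation equal to 0

def log_likelihood(theta):
    # With x = 0 this reduces to log(-theta), which is unbounded as theta -> -inf.
    return np.log(-theta) + theta * x

# The log-likelihood keeps growing as theta moves toward -infinity:
print(log_likelihood(-10.0))
print(log_likelihood(-1e6))
```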














edited Jul 22 at 13:27 by Did

asked Jul 22 at 13:05 by Monolite







1 Answer

















accepted










The problem is that you've used

$$
\frac{\mathrm d}{\mathrm d\theta}\log(-\theta) = \frac{-1}{\theta}\,,
$$

whereas

$$
\frac{\mathrm d}{\mathrm d\theta}\log(-\theta)
= -\left.\frac{\mathrm d}{\mathrm dx}\log x\,\right|_{x=-\theta}
= -\left.\frac{1}{x}\right|_{x=-\theta}
= -\frac{1}{-\theta}
= \frac{1}{\theta}\,.
$$
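The sign of this derivative is easy to verify with a central finite difference at an arbitrary admissible point, here $\theta = -2$ (chosen for illustration):

```python
import math

theta = -2.0  # any admissible (negative) parameter value
h = 1e-6

# Central finite difference of log(-theta) with respect to theta.
numeric = (math.log(-(theta + h)) - math.log(-(theta - h))) / (2 * h)

print(numeric)      # matches 1/theta = -0.5 ...
print(1 / theta)    # ... the correct derivative,
print(-1 / theta)   # not -1/theta = 0.5, the sign error in the question.
```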






answered Jul 22 at 13:22 by joriki






















             
