Why does maximum likelihood fail in this simple case?
$\def\e{\mathrm e}$
Suppose one decides to parametrize the exponential probability density in this unorthodox way:
$$
f(x; \theta) = -\theta \e^{\theta x}, \quad x > 0,
$$
where $\theta \in (-\infty, 0)$.
Then setting the derivative of the log-likelihood with one observation equal to zero gives
$$\frac{-1}{\theta} + x = 0 \iff \frac{1}{x} = \theta,$$
but this would not be a good estimator of $\theta$, since $1/x > 0$ while $\theta < 0$.
From the second derivative of the log-likelihood, $\frac{1}{\theta^2} > 0$, we see that we are in the presence of a minimum instead of a maximum.
Where did the maximum likelihood procedure go wrong? Was it an error in my calculations? The fact that the parameter space is not compact? (But in the ordinary parametrization everything works.)
For existence it should be sufficient that the parameter space is compact and the likelihood function is continuous on it.
probability statistics
If $\theta$ is negative and $\frac{1}{x}=\theta$, then $\frac{1}{x}$ is negative as well.
– Peter
Jul 22 at 13:09
@Peter But $x>0$. Was the mistake that I found the critical point without accounting for the restriction? How could I fix this?
– Monolite
Jul 22 at 13:20
With a single observation being $0$, the maximum likelihood will occur when the rate is infinite, i.e. when your $\theta=-\infty$, and this need not be at a zero of the derivative of the log-likelihood.
– Henry
Jul 22 at 17:55
edited Jul 22 at 13:27 by Did
asked Jul 22 at 13:05 by Monolite
1 Answer
The problem is that you've used
$$
\frac{\mathrm d}{\mathrm d\theta}\log(-\theta)=\frac{-1}{\theta}\;,
$$
whereas
$$
\frac{\mathrm d}{\mathrm d\theta}\log(-\theta)=-\left.\frac{\mathrm d}{\mathrm dx}\log x\,\right|_{x=-\theta}=-\left.\frac 1x\right|_{x=-\theta}=-\frac1{-\theta}=\frac1\theta\;.
$$
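For a quick sanity check, one can verify symbolically (a sketch in Python with sympy, not part of the original argument) that with the corrected derivative the critical point is $\hat\theta = -1/x$, which is negative for $x > 0$ as required, and that the second derivative there is negative, so it is indeed a maximum:

```python
import sympy as sp

theta, x = sp.symbols('theta x', real=True)

# Log-likelihood of one observation under f(x; theta) = -theta * exp(theta * x)
ell = sp.log(-theta) + theta * x

# The derivative of log(-theta) is 1/theta, so d ell / d theta = 1/theta + x
dell = sp.diff(ell, theta)
assert sp.simplify(dell - (1/theta + x)) == 0

# The unique critical point is theta = -1/x, which lies in (-inf, 0) for x > 0
assert sp.solve(sp.Eq(dell, 0), theta) == [-1/x]

# The second derivative at the critical point is -x**2 < 0: a maximum, not a minimum
assert sp.simplify(sp.diff(ell, theta, 2).subs(theta, -1/x) + x**2) == 0
```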
answered Jul 22 at 13:22 by joriki