Linear combination of power law distributions
I am exploring whether a linear combination of power law distributions is also a power law distribution. Specifically, if $X \sim (\alpha-1)\,x_{\min}^{\alpha-1}\,x^{-\alpha}$ and $Y \sim (\beta-1)\,y_{\min}^{\beta-1}\,y^{-\beta}$, what is the distribution of $Z = \rho X + (1-\rho) Y$ for $\rho \in (0, 1)$? ($X$ and $Y$ are independent.)
I tried with the moment generating function approach:
If $M_Z(t)$ is the moment generating function of $Z$, $$M_Z(t) = E[e^{tZ}] = E[e^{t(\rho X + (1-\rho)Y)}] = E[e^{t\rho X}]\cdot E[e^{t(1-\rho)Y}] = M_X(t\rho)\, M_Y(t(1-\rho)).$$
This brings us to the moment generating functions of the power law distributions themselves:
$$M_X(s) = \int_{x_{\min}}^{\infty} e^{sx}(\alpha-1)\,x_{\min}^{\alpha-1}\, x^{-\alpha} \, dx = (\alpha - 1)\,x_{\min}^{\alpha-1}\int_{x_{\min}}^{\infty} e^{sx}x^{-\alpha} \, dx.$$
As mentioned here, this resembles the upper incomplete gamma function, but I cannot complete the derivation. More importantly, I am not sure whether this will help me answer the original question: is a linear combination of two power law distributions also a power law distribution?
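(For reference, a sketch of the missing step, valid only for $s < 0$, where the integral converges: substituting $u = -sx$,
$$\int_{x_{\min}}^{\infty} e^{sx} x^{-\alpha}\,dx = (-s)^{\alpha-1}\int_{-s x_{\min}}^{\infty} e^{-u}\, u^{-\alpha}\,du = (-s)^{\alpha-1}\,\Gamma\!\left(1-\alpha,\, -s\,x_{\min}\right),$$
so $M_X(s) = (\alpha-1)\,x_{\min}^{\alpha-1}\,(-s)^{\alpha-1}\,\Gamma(1-\alpha,\, -s\,x_{\min})$, where $\Gamma(\cdot,\cdot)$ is the upper incomplete gamma function. For $s > 0$ the integrand grows without bound, so the MGF does not exist to the right of the origin.)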
Tags: probability, probability-theory, probability-distributions, moment-generating-functions
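As an empirical sanity check (a sketch, not part of the original post; the parameter values and the Hill-estimator helper below are illustrative choices), one can sample from the two Pareto-type densities above and estimate the tail exponent of $Z$. The heavier of the two tails appears to dominate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the original post).
alpha, beta, rho = 2.5, 3.5, 0.4
x_min = y_min = 1.0
n = 200_000

# Inverse-CDF sampling: the density (a-1) * m**(a-1) * x**(-a) on [m, inf)
# has survival function (m/x)**(a-1), so X = m * U**(-1/(a-1)) for U ~ U(0,1).
X = x_min * rng.uniform(size=n) ** (-1.0 / (alpha - 1.0))
Y = y_min * rng.uniform(size=n) ** (-1.0 / (beta - 1.0))
Z = rho * X + (1.0 - rho) * Y

def hill_exponent(sample, k=1000):
    """Estimate the density exponent alpha from the k largest observations,
    via the Hill estimator of the survival exponent (alpha - 1)."""
    top = np.sort(sample)[-k:]
    xi = np.mean(np.log(top[1:] / top[0]))  # estimates 1 / (alpha - 1)
    return 1.0 + 1.0 / xi

print(hill_exponent(X))  # typically close to alpha = 2.5
print(hill_exponent(Z))  # roughly min(alpha, beta), up to finite-sample bias
```

The estimate for $Z$ lands near $\min(\alpha,\beta)$, consistent with the heavier tail governing the mixture, though it says nothing about whether $Z$ is exactly a power law.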
It depends on what you mean by power law distribution. If you mean Pareto distribution then I would guess the answer would be no. If you mean particular behaviour in the tail (on a log-log scale roughly linear on the right) then I would guess the answer might be yes.
– Henry
Jul 21 at 10:39
Incidentally, power law distributions have heavy tails and some of their moments are infinite, as are their moment generating functions for all $t > 0$.
– Henry
Jul 21 at 10:47
@Henry In this case, I do mean the log-log linear behavior in the tail. And you are correct: the MGF above is finite only for $t < 0$. However, I am not able to complete the proof using this, or through differentiation of the cumulative distribution function.
– buzaku
Jul 21 at 12:05
asked Jul 20 at 4:10 by buzaku; edited Jul 20 at 4:13 by Michael Hardy
1 Answer
The moment generating function doesn't exist for many heavy-tailed distributions. In general, if you take suitably normalized linear combinations of power-law distributed random variables with infinite variance (i.e., with tail parameter $\alpha < 2$), the generalized central limit theorem gives weak convergence to a stable distribution with the same tail parameter. If $\alpha \geq 2$, the normalized linear combinations instead converge toward a Gaussian. Note that when $\alpha = 2$ the variance may still be infinite, yet convergence to a Gaussian still holds. An example of a heavy-tailed stable distribution is the Cauchy. Every stable distribution except the Gaussian has power law tails. John P. Nolan has some useful notes on his website.
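To illustrate the stability property the answer mentions (a hypothetical sketch; the sample sizes are arbitrary choices): the standard Cauchy is 1-stable, so the sample mean of i.i.d. Cauchy draws has exactly the standard Cauchy distribution again, and averaging never thins the tails.

```python
import numpy as np

rng = np.random.default_rng(1)

# The standard Cauchy is 1-stable: the mean of n i.i.d. standard Cauchy
# draws is again distributed as a standard Cauchy.
n, reps = 200, 5_000
means = rng.standard_cauchy(size=(reps, n)).mean(axis=1)

# The empirical 0.9-quantile of the means should sit near the standard
# Cauchy 0.9-quantile, tan(0.4 * pi), despite the averaging over n draws.
emp_q = np.quantile(means, 0.9)
theo_q = np.tan(0.4 * np.pi)
print(emp_q, theo_q)
```

By contrast, averaging draws with tail parameter $\alpha > 2$ would concentrate toward a Gaussian, which is exactly the dichotomy the answer describes.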
Thank you for pointing towards the stable distributions...
– buzaku
Aug 3 at 4:27
answered Aug 1 at 14:10 by Will Townes