Linear combination of power law distributions

I am exploring whether a linear combination of power law distributions is also a power law distribution. Specifically, if $X \sim (\alpha-1)\,x_{\min}^{\alpha-1}\,x^{-\alpha}$ and $Y \sim (\beta-1)\,y_{\min}^{\beta-1}\,y^{-\beta}$, with $X$ and $Y$ independent, what is the distribution of $Z = \rho X + (1-\rho)Y$ for $\rho \in (0, 1)$?



I tried with the moment generating function approach:



If $M_Z(t)$ is the moment generating function of $Z$, then by independence $$M_Z(t) = E[e^{tZ}] = E[e^{t(\rho X + (1-\rho)Y)}] = E[e^{t\rho X}]\cdot E[e^{t(1-\rho)Y}] = M_X(t\rho)\, M_Y(t(1-\rho)).$$



This brings us to the moment generating functions of the power law distributions themselves:



$$M_X(s) = \int_{x_{\min}}^{\infty} e^{sx}(\alpha-1)\,x_{\min}^{\alpha-1} x^{-\alpha}\,dx = (\alpha-1)\,x_{\min}^{\alpha-1}\int_{x_{\min}}^{\infty} e^{sx} x^{-\alpha}\,dx.$$
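For $s < 0$ this integral can in fact be closed in terms of the upper incomplete gamma function; the following substitution is my own sketch of that step, not taken from the linked source:

```latex
% Substitute u = -sx (valid for s < 0, so that u > 0 and the integral converges):
\int_{x_{\min}}^{\infty} e^{sx} x^{-\alpha}\,dx
  = (-s)^{\alpha-1}\int_{-s x_{\min}}^{\infty} e^{-u} u^{-\alpha}\,du
  = (-s)^{\alpha-1}\,\Gamma\!\left(1-\alpha,\,-s x_{\min}\right),
% where \Gamma(a, z) = \int_z^{\infty} t^{a-1} e^{-t}\,dt is the upper
% incomplete gamma function, so
M_X(s) = (\alpha-1)\,x_{\min}^{\alpha-1}\,(-s)^{\alpha-1}\,
         \Gamma\!\left(1-\alpha,\,-s x_{\min}\right), \qquad s < 0.
```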



As mentioned here, this resembles the upper incomplete gamma function, but I cannot complete the derivation. And, more importantly, I am not sure whether this will help me answer the original question: is a linear combination of two power law distributions also a power law distribution?
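One can also sanity-check the tail behaviour numerically; the sketch below is my own (not part of the question). It samples $X$ and $Y$ by inverse-CDF, forms $Z = \rho X + (1-\rho)Y$, and fits the slope of the empirical survival function on a log-log scale, which should come out near $\min(\alpha, \beta) - 1$:

```python
import numpy as np

rng = np.random.default_rng(0)

def pareto_sample(alpha, x_min, n, rng):
    # Inverse-CDF sampling for density (alpha-1) * x_min**(alpha-1) * x**(-alpha):
    # the CDF is F(x) = 1 - (x_min / x)**(alpha - 1), so F^{-1}(u) is as below.
    u = rng.random(n)
    return x_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

alpha, beta, rho, n = 2.5, 3.5, 0.4, 1_000_000
x = pareto_sample(alpha, 1.0, n, rng)
y = pareto_sample(beta, 1.0, n, rng)
z = rho * x + (1 - rho) * y

# Empirical survival function of Z; on a log-log scale the far right tail
# should be roughly linear with slope -(min(alpha, beta) - 1).
zs = np.sort(z)
surv = 1.0 - np.arange(1, n + 1) / (n + 1)
mask = zs > np.quantile(zs, 0.99)  # fit only the far tail
slope, _ = np.polyfit(np.log(zs[mask]), np.log(surv[mask]), 1)
print(f"fitted tail exponent: {-slope:.2f}")  # expect near min(alpha, beta) - 1 = 1.5
```

This only probes the tail, of course; the body of the distribution of $Z$ need not follow a power law even when the tail does.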







  • 1




    It depends on what you mean by power law distribution. If you mean Pareto distribution then I would guess the answer would be no. If you mean particular behaviour in the tail (on a log-log scale roughly linear on the right) then I would guess the answer might be yes.
    – Henry
    Jul 21 at 10:39











  • Incidentally, power law distributions have heavy tails and some of their moments are infinite, as are their moment generating functions for all $t > 0$.
    – Henry
    Jul 21 at 10:47










  • @Henry In this case, I do mean the log-log linear behavior in the tail. And you are correct, the moment generating function above is defined only for $t<0$. However, I am not able to complete the proof using this, or through differentiation of the cumulative distribution function.
    – buzaku
    Jul 21 at 12:05














asked Jul 20 at 4:10 by buzaku; edited Jul 20 at 4:13 by Michael Hardy







  • 1




    It depends on what you mean by power law distribution. If you mean Pareto distribution then I would guess the answer would be no. If you mean particular behaviour in the tail (on a log-log scale roughly linear on the right) then I would guess the answer might be yes.
    – Henry
    Jul 21 at 10:39











  • Incidentally, power law distributions have heavy tails and some of their moments are infinite, as are their moment generating functions for all $t gt 0$
    – Henry
    Jul 21 at 10:47










  • @Henry In this case, I do mean the log-log linear behavior in the tail. And you are correct, the moments are defined only for $t<0$ above. However, I am not able to complete the proof using this or through a differentiation of the cumulative distribution function.
    – buzaku
    Jul 21 at 12:05












  • 1




    It depends on what you mean by power law distribution. If you mean Pareto distribution then I would guess the answer would be no. If you mean particular behaviour in the tail (on a log-log scale roughly linear on the right) then I would guess the answer might be yes.
    – Henry
    Jul 21 at 10:39











  • Incidentally, power law distributions have heavy tails and some of their moments are infinite, as are their moment generating functions for all $t gt 0$
    – Henry
    Jul 21 at 10:47










  • @Henry In this case, I do mean the log-log linear behavior in the tail. And you are correct, the moments are defined only for $t<0$ above. However, I am not able to complete the proof using this or through a differentiation of the cumulative distribution function.
    – buzaku
    Jul 21 at 12:05







1




1




It depends on what you mean by power law distribution. If you mean Pareto distribution then I would guess the answer would be no. If you mean particular behaviour in the tail (on a log-log scale roughly linear on the right) then I would guess the answer might be yes.
– Henry
Jul 21 at 10:39





It depends on what you mean by power law distribution. If you mean Pareto distribution then I would guess the answer would be no. If you mean particular behaviour in the tail (on a log-log scale roughly linear on the right) then I would guess the answer might be yes.
– Henry
Jul 21 at 10:39













Incidentally, power law distributions have heavy tails and some of their moments are infinite, as are their moment generating functions for all $t gt 0$
– Henry
Jul 21 at 10:47




Incidentally, power law distributions have heavy tails and some of their moments are infinite, as are their moment generating functions for all $t gt 0$
– Henry
Jul 21 at 10:47












@Henry In this case, I do mean the log-log linear behavior in the tail. And you are correct, the moments are defined only for $t<0$ above. However, I am not able to complete the proof using this or through a differentiation of the cumulative distribution function.
– buzaku
Jul 21 at 12:05




@Henry In this case, I do mean the log-log linear behavior in the tail. And you are correct, the moments are defined only for $t<0$ above. However, I am not able to complete the proof using this or through a differentiation of the cumulative distribution function.
– buzaku
Jul 21 at 12:05










1 Answer

















(accepted)










The moment generating function doesn't exist for many heavy-tailed distributions. In general, if you take suitably normalized linear combinations of power-law distributed random variables with infinite variance (i.e., with tail parameter $\alpha < 2$), the generalized central limit theorem shows weak convergence to a stable distribution with the same tail parameter. If $\alpha \geq 2$, the linear combinations will tend to converge toward a Gaussian. Note that when $\alpha = 2$ the variance may still be infinite, but convergence to a Gaussian is still possible. An example of a heavy-tailed stable distribution is the Cauchy. All stable distributions except the Gaussian have power law tails. John P. Nolan has some useful notes on stable distributions on his website.
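As a quick numerical illustration of the stable case (my own sketch, not part of the answer): the standard Cauchy is stable with tail parameter $\alpha = 1$, so the average of i.i.d. Cauchy samples is again standard Cauchy, and the sample mean never concentrates the way it does in the Gaussian CLT regime.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Average n i.i.d. standard Cauchy variables, many times over.
# Because the Cauchy is stable with alpha = 1, each average is itself
# standard Cauchy -- averaging does not tame the tails.
n, reps = 100, 20_000
means = rng.standard_cauchy((reps, n)).mean(axis=1)

# Kolmogorov-Smirnov distance between the averages and a standard Cauchy;
# a small value says the averages are still Cauchy-distributed.
ks = stats.kstest(means, stats.cauchy.cdf).statistic
print(f"KS distance to standard Cauchy: {ks:.4f}")
```

Replacing the Cauchy with a Gaussian here would show the opposite behaviour: the KS distance to the *original* Gaussian grows as the averages concentrate around zero.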



























  • Thank you for pointing towards the stable distributions...
    – buzaku
    Aug 3 at 4:27










answered Aug 1 at 14:10 by Will Townes










