Help showing a Markov chain with a doubly stochastic matrix has a uniform limiting distribution
I have a lot of difficulty with proofs; could someone help me with this question, which I really cannot solve? I would also appreciate some pointers to material for working through this kind of question, and some references on Markov chains. Thanks in advance.
"A stochastic matrix is called doubly stochastic if its columns sum to 1. Let
$X_0
, X_1, dots$ be a Markov chain on $1,dots, k$ with a doubly stochastic transition
matrix and initial distribution that is uniform on $1, dots, k.$ Show that the distribution of $X_n$ is uniform on $1.dots, k,$ for all $n ge 0."$
probability markov-chains
asked Jul 14 at 23:34 by Lucas Oliveira Freitas
edited Jul 15 at 6:08 by BruceET
2 Answers
$X_1 = X_0 P$, where $P$ is the transition matrix and the initial distribution is the row vector $X_0 = [1/k, \ldots, 1/k]$. The $i$-th entry of $X_1$ is then
$$X_1^{(i)} = \frac{1}{k}\,(P_{1i} + P_{2i} + \ldots + P_{ki}) = \frac{1}{k}$$
by the doubly stochastic property (the entries in each column sum to $1$). The general result follows by induction: if $X_n$ is uniform, the same computation applied to $X_{n+1} = X_n P$ shows that $X_{n+1}$ is uniform.
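As a quick numerical sanity check (a minimal sketch, assuming NumPy is available; the $3 \times 3$ matrix `P` below is just one arbitrary doubly stochastic example, not part of the problem):

```python
import numpy as np

# An example doubly stochastic matrix: every row AND every column sums to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.2, 0.5]])

k = P.shape[0]
x = np.full(k, 1.0 / k)  # uniform initial distribution X_0

# Propagate the distribution: X_{n+1} = X_n P (row vector times matrix).
for n in range(10):
    x = x @ P
    assert np.allclose(x, 1.0 / k), f"not uniform at step {n + 1}"

print(x)  # stays [1/3, 1/3, 1/3] at every step
```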
answered Jul 15 at 1:32 by Momo
If a Markov chain with state space $\{1, 2, \dots, k\}$ is ergodic, then its stationary distribution is its limiting distribution. As I read your question, it seems you are trying to show that a chain with a doubly stochastic matrix has a uniform stationary distribution on the state space.
You are using the convention that element $p_{ij}$, for $1 \le i, j \le k$, of the transition matrix $\mathbf{P}$ is $p_{ij} = P(X_n = j \mid X_{n-1} = i)$.
Let $\sigma = (\sigma_1, \sigma_2, \dots, \sigma_k)$ be a row vector. If $\sigma$ is uniform, then $\sigma_i = 1/k$ for $i = 1, \dots, k$, and the $j$-th entry of $\sigma\mathbf{P}$ is
$$(\sigma\mathbf{P})_j = \sum_{i=1}^k \frac{1}{k}\, p_{ij} = \frac{1}{k}\sum_{i=1}^k p_{ij} = \frac{1}{k}(1) = \frac{1}{k},$$
where the last equality is due to the doubly stochastic property of the transition matrix (each column sums to $1$). The argument is the same for all $j = 1, \dots, k$, so $\sigma\mathbf{P} = \sigma$, and $\sigma$ is a stationary distribution.
If your question is not covered by this answer and the answer of @Momo (+1), please explain what part you don't understand, and maybe someone here can help.
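To get the statement asked for every $n$, note that the distribution $\pi^{(n)}$ of $X_n$ satisfies the recursion $\pi^{(n)} = \pi^{(n-1)}\mathbf{P}$. Since $\pi^{(0)} = \sigma$ and $\sigma\mathbf{P} = \sigma$, induction gives
$$\pi^{(n)} = \pi^{(n-1)}\mathbf{P} = \sigma\mathbf{P} = \sigma \quad\text{for all } n \ge 1,$$
so the distribution of $X_n$ is uniform on $\{1, \dots, k\}$ for all $n \ge 0$.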
answered Jul 15 at 6:35 by BruceET