Markov chain - prob. of visiting some state before another, number of visits at intermediate state
For a finite Markov chain with transition matrix given by:
$$P=\begin{bmatrix} \frac{1}{3} & \frac{2}{3} & 0 & 0 & 0 & 0 \\ \frac{2}{3} & 0 & \frac{1}{3} & 0 & 0 & 0 \\ 0 & \frac{1}{3} & 0 & \frac{2}{3} & 0 & 0 \\ 0 & 0 & \frac{2}{3} & 0 & \frac{1}{3} & 0 \\ 0 & 0 & 0 & \frac{1}{3} & 0 & \frac{2}{3} \\ 0 & 0 & 0 & 0 & \frac{2}{3} & \frac{1}{3} \end{bmatrix}$$
I need to find:
- the probability of visiting state 6 before visiting state 1, given that my initial state is 2 (so that my initial distribution vector is $\mu_0=\begin{bmatrix}0 & 1 & 0 & 0 & 0 & 0\end{bmatrix}$);
- the mean number of visits to state 1 until returning to state 6, assuming $X_0=6$.
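For concreteness, here is a minimal NumPy sketch encoding $P$ and $\mu_0$ and checking that $P$ is doubly stochastic (which is what makes the uniform distribution stationary):

```python
import numpy as np

# Transition matrix from the problem statement
P = np.array([
    [1/3, 2/3, 0,   0,   0,   0  ],
    [2/3, 0,   1/3, 0,   0,   0  ],
    [0,   1/3, 0,   2/3, 0,   0  ],
    [0,   0,   2/3, 0,   1/3, 0  ],
    [0,   0,   0,   1/3, 0,   2/3],
    [0,   0,   0,   0,   2/3, 1/3],
])

mu0 = np.array([0, 1, 0, 0, 0, 0])   # start in state 2

# Rows and columns both sum to 1, i.e. P is doubly stochastic,
# so pi = (1/6, ..., 1/6) satisfies pi @ P = pi.
print(P.sum(axis=1))   # all ones (up to floating point)
print(P.sum(axis=0))   # all ones (up to floating point)
```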
My attempts didn't get me very far.
As for part one:
I tried solving this along the lines of a similar question, letting $p_i$ denote the probability of visiting state 6 if I'm currently at state $i\in\{1,\dots,6\}$, but I got a system of linear equations whose only solution is $p_i=1$ for every $i$... and in general I had trouble formalizing this question in terms of the random variables of the states.
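The issue is presumably the boundary conditions: without fixing $p_1=0$ as well as $p_6=1$ (i.e. treating both 1 and 6 as absorbing), the equations $p_i=\sum_j p_{ij}p_j$ only say that state 6 is eventually reached, which is certain in a finite irreducible chain, hence the all-ones solution. A sketch of the first-step analysis with both boundary conditions imposed, solved numerically:

```python
import numpy as np

# First-step analysis for h(i) = P(hit state 6 before state 1 | X_0 = i).
# Boundary conditions: h(1) = 0, h(6) = 1 (0-indexed: h[0] = 0, h[5] = 1).
# For the interior states i = 2,...,5:  h(i) = sum_j P(i,j) h(j).

P = np.array([
    [1/3, 2/3, 0,   0,   0,   0  ],
    [2/3, 0,   1/3, 0,   0,   0  ],
    [0,   1/3, 0,   2/3, 0,   0  ],
    [0,   0,   2/3, 0,   1/3, 0  ],
    [0,   0,   0,   1/3, 0,   2/3],
    [0,   0,   0,   0,   2/3, 1/3],
])

interior = [1, 2, 3, 4]             # 0-indexed states 2..5
A = np.eye(4) - P[np.ix_(interior, interior)]
b = P[interior, 5]                  # one-step probability of jumping straight to state 6
h = np.linalg.solve(A, b)
print(h[0])                         # P(6 before 1 | start at 2) -> 0.142857... = 1/7
```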
As for part two:
Denoting by $t^*_6=\mathbb{E}\left[T_6\mid X_0=6\right]$ the mean recurrence time of state 6, I tried solving the system given by $t_i=1+\sum_{j\neq 6}p_{ij}t_j$, but it became a mess. Also, since the stationary distribution is $\pi=\left[\frac{1}{6}, \dots, \frac{1}{6}\right]$ and the chain is aperiodic and irreducible, it follows that $t_6^*=6$; but that still doesn't answer the question of how many times, on average, state 1 is visited.
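For what it's worth, a standard occupation-time identity says that for an irreducible positive recurrent chain, the expected number of visits to state $j$ between consecutive visits to state $i$ equals $\pi_j/\pi_i$, which here would give $\pi_1/\pi_6=1$. A sketch that checks this numerically by solving for $v(i)$, the expected number of visits to state 1 before first reaching state 6 when starting from $i$:

```python
import numpy as np

# v(i) = expected number of visits to state 1 before first reaching state 6,
# starting from i (a visit at time 0 counts when i = 1).  For i != 6:
#     v(i) = 1{i == 1} + sum_{j != 6} P(i, j) v(j)
# The quantity asked for, starting from X_0 = 6, is then sum_j P(6, j) v(j).

P = np.array([
    [1/3, 2/3, 0,   0,   0,   0  ],
    [2/3, 0,   1/3, 0,   0,   0  ],
    [0,   1/3, 0,   2/3, 0,   0  ],
    [0,   0,   2/3, 0,   1/3, 0  ],
    [0,   0,   0,   1/3, 0,   2/3],
    [0,   0,   0,   0,   2/3, 1/3],
])

keep = [0, 1, 2, 3, 4]                      # 0-indexed states 1..5 (state 6 removed)
Q = P[np.ix_(keep, keep)]
r = np.array([1, 0, 0, 0, 0], dtype=float)  # indicator of being in state 1
v = np.linalg.solve(np.eye(5) - Q, r)       # v = (I - Q)^{-1} r

answer = P[5, keep] @ v                     # first step out of state 6, then count visits
print(answer)                               # -> 1.0, matching pi_1 / pi_6 = 1
```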
Any help would be appreciated.
probability markov-chains
edited Jul 26 at 15:03
asked Jul 25 at 22:07
gbi1977
1 Answer
Let $p_i$ denote the probability that you reach state $6$ before state $1$ starting from state $i$. Then $p_2=\frac{1}{3}p_3$ and $p_3=\frac{1}{3}p_2+\frac{2}{3}p_4$. By the symmetry of the problem, $p_4=1-p_3$. Substituting the first and third equations into the second yields $p_3=\frac{1}{9}p_3+\frac{2}{3}(1-p_3)$, with solution $p_3=\frac{3}{7}$ and thus $p_2=\frac{1}{7}$.
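A quick Monte Carlo sanity check of that value (a sketch using NumPy; the exact answer is $\frac{1}{7}\approx 0.1429$):

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([
    [1/3, 2/3, 0,   0,   0,   0  ],
    [2/3, 0,   1/3, 0,   0,   0  ],
    [0,   1/3, 0,   2/3, 0,   0  ],
    [0,   0,   2/3, 0,   1/3, 0  ],
    [0,   0,   0,   1/3, 0,   2/3],
    [0,   0,   0,   0,   2/3, 1/3],
])

def hits_6_before_1(start=1):
    """Run the chain from `start` (0-indexed) until state 1 or 6 is hit; True if 6 comes first."""
    state = start
    while state not in (0, 5):
        state = rng.choice(6, p=P[state])
    return state == 5

n = 200_000
print(sum(hits_6_before_1() for _ in range(n)) / n)   # roughly 0.1429 = 1/7
```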
answered Jul 25 at 22:16
joriki