3. Consider a Markov chain with state space S = {1, 2, 3, 4, 5, 6} and transition probability matrix

P = [0 0 0.3 0 0 0 0.5 0 1 0 0 0.2 0.1 0.7 0 0 0 0 0 0.1 0.4 0 0 0 0 0.4 0 0.3 0 1 0 0 0 1]

(a) Compute P^∞.

(b) If the process starts in state 3, what are the probabilities that it will be absorbed in state 2, state 5, and state 6, respectively?
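Part (b) is the standard absorbing-chain calculation: order the states so the transient ones come first, split P into the transient-to-transient block Q and the transient-to-absorbing block R, and read the absorption probabilities off B = (I − Q)⁻¹R. Below is a minimal NumPy sketch of that calculation; the matrix entries in it are made-up placeholders with states 2, 5, and 6 absorbing (the matrix as posted is too garbled to transcribe reliably), so only the method, not the numbers, carries over to the actual problem.

```python
import numpy as np

# Placeholder 6-state transition matrix with states 2, 5, 6 absorbing
# (1-indexed). These entries are illustrative only, NOT the matrix from
# the original problem, whose layout was lost when the post was scraped.
P = np.array([
    [0.0, 0.2, 0.3, 0.0, 0.5, 0.0],  # state 1 (transient)
    [0.0, 1.0, 0.0, 0.0, 0.0, 0.0],  # state 2 (absorbing)
    [0.3, 0.0, 0.0, 0.4, 0.0, 0.3],  # state 3 (transient)
    [0.1, 0.1, 0.4, 0.0, 0.0, 0.4],  # state 4 (transient)
    [0.0, 0.0, 0.0, 0.0, 1.0, 0.0],  # state 5 (absorbing)
    [0.0, 0.0, 0.0, 0.0, 0.0, 1.0],  # state 6 (absorbing)
])

transient = [0, 2, 3]   # 0-indexed states 1, 3, 4
absorbing = [1, 4, 5]   # 0-indexed states 2, 5, 6

Q = P[np.ix_(transient, transient)]  # transient -> transient block
R = P[np.ix_(transient, absorbing)]  # transient -> absorbing block

# Fundamental matrix N = (I - Q)^-1; row i of B = N @ R gives the
# probabilities of absorption in states 2, 5, 6 starting from the
# i-th transient state.
N = np.linalg.inv(np.eye(len(transient)) - Q)
B = N @ R

# State 3 is transient[1], so its absorption probabilities are B[1];
# the row sums to 1 because absorption is certain from every transient state.
print("Absorption probabilities from state 3 into states 2, 5, 6:", B[1])
```

The same B also gives the shape of the answer to part (a): in the limit matrix P^∞, the absorbing rows are unchanged, the transient-to-transient block is zero, and the transient-to-absorbing block is exactly B, since a chain started in any transient state is eventually absorbed with probability 1.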