3. Consider a Markov chain with state space S = {1, 2, 3, 4, 5, 6} and transition probability matrix P = …
Posted: Fri Jul 01, 2022 9:08 am
3. Consider a Markov chain with state space S = {1, 2, 3, 4, 5, 6} and transition probability matrix

        [ 0    0.5  0.1  0    0.4  0  ]
        [ 0    1    0    0    0    0  ]
    P = [ 0.3  0    0    0.4  0.3  0  ]
        [ 0    0.2  0.7  0.1  0    0  ]
        [ 0    0    0    0    1    0  ]
        [ 0    0    0    0    0    1  ]

(a) Compute Po.

(b) If the process starts in state 3, what are the probabilities that it will be absorbed in state 2, state 5, and state 6, respectively?
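Part (b) is the standard absorbing-chain calculation: order the states so the transient ones come first, write P in block form with Q (transient-to-transient) and R (transient-to-absorbing), and the absorption probabilities are the rows of B = (I − Q)⁻¹ R. Here is a minimal sketch in exact rational arithmetic; note the matrix entries in the comments are an assumed reading of the garbled matrix in the post (rows 2, 5, 6 absorbing, rows summing to 1), not a verified transcription:

```python
from fractions import Fraction as F

# ASSUMED reading of the posted matrix (rows; states 1..6):
#   1: (0,   0.5, 0.1, 0,   0.4, 0  )
#   2: (0,   1,   0,   0,   0,   0  )   <- absorbing
#   3: (0.3, 0,   0,   0.4, 0.3, 0  )
#   4: (0,   0.2, 0.7, 0.1, 0,   0  )
#   5: (0,   0,   0,   0,   1,   0  )   <- absorbing
#   6: (0,   0,   0,   0,   0,   1  )   <- absorbing
# Transient states T = (1, 3, 4) in that order; absorbing A = {2, 5, 6}.

# Q: transitions among transient states; R[a]: transient -> absorbing state a.
Q = [[F(0),     F(1, 10), F(0)],
     [F(3, 10), F(0),     F(4, 10)],
     [F(0),     F(7, 10), F(1, 10)]]
R = {2: [F(5, 10), F(0),     F(2, 10)],
     5: [F(4, 10), F(3, 10), F(0)],
     6: [F(0),     F(0),     F(0)]}

def solve(A, b):
    """Gauss-Jordan elimination in exact rational arithmetic."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = next(r for r in range(c, n) if M[r][c] != 0)  # pivot row
        M[c], M[p] = M[p], M[c]
        M[c] = [x / M[c][c] for x in M[c]]                # scale pivot row to 1
        for r in range(n):
            if r != c and M[r][c] != 0:                   # eliminate column c
                M[r] = [x - M[r][c] * y for x, y in zip(M[r], M[c])]
    return [row[n] for row in M]

# Absorption probabilities B = (I - Q)^(-1) R, one absorbing state at a time.
ImQ = [[(F(1) if i == j else F(0)) - Q[i][j] for j in range(3)] for i in range(3)]
for a in (2, 5, 6):
    u = solve(ImQ, R[a])
    print(f"P(absorbed in {a} | start in 3) = {u[1]}")  # index 1 <-> state 3
```

Under this assumed matrix the three probabilities from state 3 come out as 215/593, 378/593, and 0, which sum to 1 as they must; state 6 gets probability 0 because no transient state feeds into it in this reading, so the chain started at 3 can only be absorbed at 2 or 5.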