4. Consider a Markov chain {Xn, n = 0, 1, 2, ...} with state space S = {1, 2, 3, 4, 5} and transition matrix

P =
[  1     0     0     0     0  ]
[  0    1/2    0    1/2    0  ]
[ 1/2    0     0    1/2    0  ]
[  0     1     0     0     0  ]
[ 1/3    0    1/3    0    1/3 ]

Let Z0 = 0 and, for n = 1, 2, ..., define Zn = |Xn - Xn-1|.

a) Write down the state space of the stochastic process {Zn, n = 0, 1, 2, ...}.

b) Show that {Zn} is NOT a Markov chain by giving an example of where the Markov property breaks down.
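Below is a minimal Python sketch for exploring the problem, assuming the reconstruction of P shown above (entries of the garbled matrix read so that each row sums to 1, with rows indexed by the current state); if the intended matrix differs, only the array literal needs to change. The function name `simulate`, the random seed, and the run length are illustrative choices, not part of the problem. The simulation checks that P is row-stochastic and pools the values taken by Zn = |Xn - Xn-1| along sample paths, which is a way to sanity-check an answer to part (a).

```python
import numpy as np

# Transition matrix P as reconstructed above (each row sums to 1).
# The row/column layout is a reading of the garbled post, not the original image.
P = np.array([
    [1.0, 0.0, 0.0, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.5, 0.0],
    [0.5, 0.0, 0.0, 0.5, 0.0],
    [0.0, 1.0, 0.0, 0.0, 0.0],
    [1/3, 0.0, 1/3, 0.0, 1/3],
])
assert np.allclose(P.sum(axis=1), 1.0)  # sanity check: row-stochastic

rng = np.random.default_rng(0)

def simulate(x0, n_steps):
    """Simulate X_0, ..., X_n (states 1..5) and the derived process Z_n = |X_n - X_{n-1}|."""
    xs = [x0]
    for _ in range(n_steps):
        xs.append(int(rng.choice(5, p=P[xs[-1] - 1])) + 1)
    zs = [0] + [abs(xs[k] - xs[k - 1]) for k in range(1, len(xs))]
    return xs, zs

# Pool the values Z_n takes when the chain is started from every state.
observed = set()
for start in range(1, 6):
    _, zs = simulate(start, 2000)
    observed.update(zs)
print(sorted(observed))
```

A simulation like this only suggests which values Zn can take; for part (a) the state space should be justified by listing Z0 = 0 together with |i - j| over all transitions i -> j with positive probability, and part (b) still requires exhibiting two histories with the same current value of Zn but different conditional distributions for Zn+1.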