
Q: Consider the Markov chain defined on states S = {0, 1, 2, 3} whose transition probability matrix is

P =
    | 1    0    0    0   |
    | 0.2  0.3  0.1  0.4 |
    | 0.3  0.1  0.5  0.1 |
    | 0    0    0    1   |

It is known that the process starts in state 1.
(a) Determine the probability that the Markov chain ends in state 0.
(b) Determine the mean time that the process spends in state 1 prior to absorption.
(c) Determine the mean time that the process spends in state 2 prior to absorption.
(d) Determine the mean time to absorption.
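
Because P(0,0) = P(3,3) = 1, states 0 and 3 are absorbing and states 1 and 2 are transient, so all four parts reduce to the fundamental matrix N = (I - Q)^(-1), where Q is the transient-to-transient block of P and R is the transient-to-absorbing block: the entries of N give the mean visit counts for parts (b)-(d), and B = N R gives the absorption probabilities for part (a). The NumPy sketch below is one way to carry out that computation; it is not a posted solution, just the standard fundamental-matrix method applied to the matrix given above.

    import numpy as np

    # Transition matrix from the problem; states 0 and 3 are absorbing.
    P = np.array([
        [1.0, 0.0, 0.0, 0.0],
        [0.2, 0.3, 0.1, 0.4],
        [0.3, 0.1, 0.5, 0.1],
        [0.0, 0.0, 0.0, 1.0],
    ])

    transient, absorbing = [1, 2], [0, 3]
    Q = P[np.ix_(transient, transient)]   # transient -> transient block
    R = P[np.ix_(transient, absorbing)]   # transient -> absorbing block

    # Fundamental matrix: N[i, j] = expected number of visits to transient
    # state j before absorption, given a start in transient state i.
    N = np.linalg.inv(np.eye(len(transient)) - Q)
    B = N @ R                             # absorption probabilities

    # The chain starts in state 1, which is row 0 of the transient block.
    print("(a) P(end in state 0):      ", B[0, 0])     # 13/34 ~ 0.3824
    print("(b) mean time in state 1:   ", N[0, 0])     # 25/17 ~ 1.4706
    print("(c) mean time in state 2:   ", N[0, 1])     # 5/17  ~ 0.2941
    print("(d) mean time to absorption:", N[0].sum())  # 30/17 ~ 1.7647

As a hand check, I - Q = [[0.7, -0.1], [-0.1, 0.5]] has determinant 0.34, so N = (1/0.34) [[0.5, 0.1], [0.1, 0.7]] = [[25/17, 5/17], [5/17, 35/17]]. The fractions in the comments follow directly from that inverse: (d) is the row sum 25/17 + 5/17 = 30/17, and (a) is 0.2(25/17) + 0.3(5/17) = 13/34.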
