Absorbing Markov Chains
- A Markov chain that includes one or more states that permanently trap the process (absorbing or sink states).
- Once the process enters an absorbing state it stays there indefinitely and cannot return to non-absorbing states.
- Used to model irreversible outcomes and to compute the probability of reaching those outcomes and the expected time to reach them.
Definition
An absorbing Markov chain is a Markov chain containing one or more “absorbing” states, also known as “sink states”, which the system cannot leave once it enters them. After entering an absorbing state, the system remains there indefinitely and never returns to any non-absorbing state.
Explanation
In an absorbing Markov chain, some states are designated absorbing (sink) states. The process moves among the non-absorbing (transient) states according to transition probabilities, but once it reaches an absorbing state it remains there forever. These chains are well suited to modeling irreversible outcomes: the process wanders among transient states until it is eventually absorbed.
Absorbing Markov chains are used to determine the probability that the process ends up in each absorbing state and the expected time until absorption.
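These two quantities can be computed with the fundamental matrix N = (I − Q)⁻¹, where Q is the block of transition probabilities among transient states and R the block from transient to absorbing states. A minimal sketch in Python with NumPy, using illustrative probabilities (not taken from the text):

```python
import numpy as np

# Canonical form: transient states first, then absorbing states.
# Q: transitions among transient states, R: transient -> absorbing.
# These probabilities are made up for illustration.
Q = np.array([[0.5, 0.3],
              [0.2, 0.4]])
R = np.array([[0.2],
              [0.4]])

# Fundamental matrix: N[i, j] = expected number of visits to transient
# state j when starting from transient state i.
N = np.linalg.inv(np.eye(len(Q)) - Q)

t = N.sum(axis=1)   # expected steps until absorption from each transient state
B = N @ R           # probability of ending in each absorbing state

print(t)  # -> [3.75       2.91666667]
print(B)  # with a single absorbing state, every entry is 1.0
```

Because this example has only one absorbing state, every row of B is 1: absorption is certain, and only the expected time differs by starting state.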
Examples
Weather model
States: “sunny”, “cloudy”, and “rainy”.
If the system starts in the “sunny” state, it can either stay sunny or become cloudy with a certain probability. If it becomes cloudy, it can either stay cloudy or become rainy with a certain probability. If it becomes rainy, it will remain rainy and will not return to the sunny or cloudy states. In this case, the “rainy” state is the absorbing state.
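This weather chain can be written as a transition matrix and analyzed the same way. The probabilities below are assumptions for illustration; only the structure (rainy absorbs) comes from the example:

```python
import numpy as np

# States: 0 = sunny, 1 = cloudy, 2 = rainy (absorbing).
# Probabilities are illustrative assumptions.
P = np.array([
    [0.7, 0.3, 0.0],  # sunny: stay sunny or turn cloudy
    [0.0, 0.6, 0.4],  # cloudy: stay cloudy or turn rainy
    [0.0, 0.0, 1.0],  # rainy: absorbing, stays rainy forever
])

Q = P[:2, :2]                      # transient-to-transient block
N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix
t = N.sum(axis=1)                  # expected days until rain

print(t)  # -> [5.83333333 2.5       ]
```

Note the sanity check: starting cloudy, rain arrives with probability 0.4 each day, so the expected wait is 1/0.4 = 2.5 days, matching the second entry of t.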
Customer behavior model
States: “shopping”, “deciding”, and “purchased”.
If a customer starts in the “shopping” state, they can either continue shopping or move to the “deciding” state with a certain probability. From “deciding”, they can either continue deciding or move to the “purchased” state with a certain probability. Once they enter the “purchased” state, they remain there indefinitely and never return to “shopping” or “deciding”. In this case, the “purchased” state is the absorbing state.
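The customer chain can also be explored by Monte Carlo simulation instead of matrix algebra. A small sketch with illustrative transition probabilities (the structure matches the example; the numbers are assumptions):

```python
import random

# States: "shopping" -> "deciding" -> "purchased" (absorbing).
# Transition probabilities are illustrative assumptions.
TRANSITIONS = {
    "shopping": [("shopping", 0.6), ("deciding", 0.4)],
    "deciding": [("deciding", 0.5), ("purchased", 0.5)],
}

def steps_until_purchase(rng, start="shopping"):
    """Simulate one customer and count steps until absorption."""
    state, steps = start, 0
    while state != "purchased":
        r, cum = rng.random(), 0.0
        for nxt, p in TRANSITIONS[state]:
            cum += p
            if r < cum:
                state = nxt
                break
        steps += 1
    return steps

rng = random.Random(42)
avg = sum(steps_until_purchase(rng) for _ in range(100_000)) / 100_000
print(avg)  # close to the theoretical expectation 1/0.4 + 1/0.5 = 4.5
```

Each stage is left after a geometrically distributed number of steps, so the exact expected time is 1/0.4 + 1/0.5 = 4.5 steps; the simulated average should land near that value.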
Use cases
- Modeling systems with irreversible outcomes or states that cannot be escaped once reached.
- Computing the probability of reaching an absorbing state and the expected time to reach it.
Related terms
- Markov chain
- Absorbing state
- Sink state
- Non-absorbing state