Term: Markov Model

Glossary Definition

Last Updated: 2015-12-03

Definition:

A Markov model is a modelling approach that predicts the probability that a process will occupy a specific state in the future based on its current state and on chance (Last, 2001). The approach may assume a constant probability for each state transition (a stationary model) or allow those probabilities to change over time (a non-stationary model) (Schaubel et al., 1998). A non-stationary model can therefore account for factors that alter the likelihood of future transitions.
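The distinction between stationary and non-stationary models can be sketched as follows. This is an illustrative example only; the two-state system and all probabilities below are hypothetical, not taken from the cited studies.

```python
# Sketch of a stationary two-state Markov model (states: "well", "ill").
# All probabilities are illustrative, not drawn from the cited studies.

def step(dist, matrix):
    """Advance a state distribution one period: p(t+1) = p(t) * P."""
    n = len(matrix)
    return [sum(dist[i] * matrix[i][j] for i in range(n)) for j in range(n)]

# Stationary model: the same transition matrix applies every period.
P = [
    [0.9, 0.1],  # from "well": 90% stay well, 10% become ill
    [0.3, 0.7],  # from "ill":  30% recover,   70% stay ill
]

dist = [1.0, 0.0]          # everyone starts in the "well" state
for _ in range(3):
    dist = step(dist, P)   # three periods under constant probabilities

# A non-stationary model would instead supply a different matrix for
# each period, e.g. step(dist, P_by_year[t]).
```

Because the matrix rows each sum to 1, the distribution remains a valid probability distribution after every step.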

A Markov model can be used to evaluate the outcomes of a process, such as whether a patient diagnosed with kidney failure who enters the Manitoba Renal Program (MRP) will receive hemodialysis, peritoneal dialysis, home hemodialysis, or a kidney transplant, or will die. For more information on how this modelling approach was used in Chatier et al. (2015), please read Appendix 2: Markov Model Transitional Probability Matrix in that report.
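A transition probability matrix of the kind referenced above can be sketched as follows. The state names mirror the MRP outcomes listed in the text, but every probability here is hypothetical and stands in for the actual values reported in the appendix.

```python
# Hypothetical one-cycle transition probability matrix over renal-program
# outcomes. States mirror those named in the text; all probabilities are
# illustrative placeholders, not values from the report's appendix.

STATES = ["hemodialysis", "peritoneal dialysis", "home hemodialysis",
          "transplant", "death"]

# Row i gives the probability of moving from STATES[i] to each state
# over one cycle; each row must sum to 1.
P = [
    [0.70, 0.05, 0.05, 0.10, 0.10],  # hemodialysis
    [0.10, 0.65, 0.05, 0.12, 0.08],  # peritoneal dialysis
    [0.05, 0.05, 0.75, 0.10, 0.05],  # home hemodialysis
    [0.00, 0.00, 0.00, 0.95, 0.05],  # transplant (near-absorbing)
    [0.00, 0.00, 0.00, 0.00, 1.00],  # death (absorbing state)
]

def transition_prob(frm, to):
    """Probability of moving from state `frm` to state `to` in one cycle."""
    return P[STATES.index(frm)][STATES.index(to)]

# Sanity check: every row of a transition matrix sums to 1.
assert all(abs(sum(row) - 1.0) < 1e-9 for row in P)
```

Death is modelled as an absorbing state (probability 1 of remaining), which is how mortality is typically handled in health-outcome Markov models.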
