Markov Decision Processes: A Tool for Sequential Decision Making under Uncertainty

Medical Decision Making - Vol. 30, No. 4, pp. 474-483 - 2010
Oğuzhan Alagöz1, Heather Hsu2, Andrew J. Schaefer3, Mark S. Roberts4
1Department of Industrial and Systems Engineering, University of Wisconsin-Madison, Madison, WI 53706, USA
2Department of Industrial Engineering, University of Pittsburgh, Pittsburgh, PA, Section of Decision Sciences and Clinical Systems Modeling, Division of General Medicine, and Department of Medicine, University of Pittsburgh School of Medicine, Pittsburgh, PA
3Department of Industrial Engineering, University of Pittsburgh, Pittsburgh, PA
4Department of Health Policy and Management, University of Pittsburgh Graduate School of Public Health, Pittsburgh, PA, Section of Decision Sciences and Clinical Systems Modeling, Division of General Medicine, and Department of Medicine, University of Pittsburgh School of Medicine, Pittsburgh, PA

Abstract

We provide a tutorial on the construction and evaluation of Markov decision processes (MDPs), powerful analytical tools for sequential decision making under uncertainty that have been widely applied in industrial and manufacturing settings but are underutilized in medical decision making (MDM). We demonstrate the use of an MDP to solve a sequential clinical treatment problem under uncertainty. Markov decision processes generalize standard Markov models by embedding a decision process in the model, with multiple decisions made over time, and they offer significant advantages over standard decision analysis. We compare MDPs to standard Markov-based simulation models by solving the problem of the optimal timing of living-donor liver transplantation using both methods. Both models yield the same optimal transplantation policy and the same total life expectancies for the same patient and living donor, but the computation time for solving the MDP model is significantly shorter than that for solving the Markov model. We briefly describe the growing literature on MDPs applied to medical decisions.
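The abstract's "wait vs. transplant" structure can be sketched as a small finite-state MDP solved by value iteration, the standard dynamic-programming algorithm for such models. The sketch below is illustrative only: the two health states, transition probabilities, per-period rewards, and discount factor are invented for the example and are not the paper's clinical data.

```python
# Minimal sketch of an infinite-horizon discounted MDP solved by value
# iteration, loosely following the "transplant now vs. wait" structure
# described in the abstract. All numbers are illustrative assumptions.
import numpy as np

# Health states: 0 = good health, 1 = poor health.
P_wait = np.array([[0.9, 0.1],   # from good health: stay good / worsen
                   [0.2, 0.8]])  # from poor health: recover / stay poor
r_wait = np.array([1.0, 0.2])        # reward accrued per period while waiting
r_transplant = np.array([8.0, 6.0])  # terminal expected post-transplant reward

gamma = 0.9  # discount factor (illustrative)

def value_iteration(tol=1e-9):
    """Return the optimal value function and policy (0 = wait, 1 = transplant)."""
    V = np.zeros(2)
    while True:
        Q_wait = r_wait + gamma * (P_wait @ V)  # keep waiting one more period
        Q_tx = r_transplant                     # transplanting ends the process
        V_new = np.maximum(Q_wait, Q_tx)        # Bellman optimality update
        if np.max(np.abs(V_new - V)) < tol:
            policy = (Q_tx >= Q_wait).astype(int)
            return V_new, policy
        V = V_new

V, policy = value_iteration()
print("Optimal values:", V)
print("Optimal policy (0=wait, 1=transplant):", policy)
```

With these made-up numbers the optimal policy is to wait while in good health and transplant in poor health, illustrating the threshold-style policies the MDP literature on transplant timing describes. Because the policy falls out of a single backward-induction computation rather than repeated simulation runs, the speed advantage over Markov simulation noted in the abstract is also easy to see in this toy setting.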

References

10.1145/167293.167816

10.1177/0272989X8300300403

10.1177/0272989X9701700201

10.1287/opre.1080.0648

10.1002/9780470316887

Schaefer AJ, Bailey MD, Shechter SM, Roberts MS. Modeling medical treatment using Markov decision processes. In: Handbook of Operations Research/Management Science Applications in Health Care. Boston, MA: Kluwer Academic Publishers; 2004. p. 593-612.

Bertsekas DP, 2001, Dynamic Programming and Stochastic Control. Vols. 1 and 2

Bellman RE, 1957, Dynamic Programming

Denardo EV, 2003, Dynamic Programming: Models and Applications

10.1287/opre.29.5.971

10.1287/opre.44.5.696

10.1016/S0933-3657(99)00042-1

10.1287/mnsc.42.5.629

10.1287/mnsc.1040.0287

10.1287/opre.1060.0329

10.1287/mnsc.1070.0726

10.1287/opre.1070.0480

10.1007/978-3-540-68405-3_20

10.1287/opre.1080.0614

Faissol D, Swann J, 2006, Timing of Testing and Treatment of Hepatitis C and Other Diseases

Chhatwal J., 29th Annual Meeting of the Society for Medical Decision Making

10.1177/0272989X08329462

10.1080/07408170802165872

Kreke JE, 2007, Decisions for Patients with Pneumonia-Related Sepsis [PhD dissertation]

Kurt M., At what lipid ratios should a patient with type 2 diabetes initiate statins?

Gold MR, 1996, Cost-Effectiveness in Health and Medicine, 10.1093/oso/9780195108248.001.0001

Winston WL, 1997, Operations Research: Applications and Algorithms

10.1177/0272989X05282719

10.1002/lt.20137

Alagoz O., 2004, Optimal Policies for the Acceptance of Living- and Cadaveric-Donor Livers [PhD dissertation]

2006, TreeAge Pro Suite [computer program]. Version release 1

10.1002/9780470182963