This talk introduces Markov decision processes (MDPs) and partially observable Markov decision processes (POMDPs) and their application to medical decision making. The focus of the talk will be the application of POMDPs to breast cancer screening decisions. Determining when to screen for breast cancer with mammography is a complex problem involving multiple decision makers with competing objectives. Questions regarding the relative value and frequency of mammography screening for women remain open due to the conflicting age-based dynamics of both the disease (increasing incidence, decreasing aggressiveness) and the accuracy of the test results (increasing sensitivity and specificity). To investigate these questions, we formulate a partially observed Markov chain model that captures several of these age-based dynamics not previously considered simultaneously. This model incorporates the uncertainty associated with the partial observability of the disease by the decision maker. Using sample path enumeration, we evaluate a broad range of policies to generate the set of “efficient” policies, as measured by lifetime breast cancer mortality risk and expected mammogram count, from which a patient may select a policy based on individual circumstances. We discuss extensions of this research to develop personalized screening policies by age, race, and comorbidity. We use population-based Carolina Mammography Registry data to develop competing risks models for estimating mortality probabilities for breast cancer patients as a function of age, race, stage at detection, and screening behaviors.
IPC Short Course: Operations Research or the Mathematics of Strategic Decision Making