Title: A Markov decision process model of allocating emergency medical resource among multi-priority injuries

Authors: Yan Ni; Ke Wang; Lindu Zhao

Addresses: School of Economics and Management, Southeast University, Nanjing, Jiangsu 210096, China (all authors)

Abstract: This article studies how to allocate a scarce emergency medical resource at the beginning of a large-scale disaster. A salient problem in emergency rescue is how to coordinate the resource among injuries of different severity levels across different periods. The injuries are classified into three levels and assigned different priorities. Waiting cost under resource shortage, deterioration cost caused by delay, and the cost of transferring injuries to other hospitals are introduced into the decision-making process. By incorporating these costs into the revenue function, a Markov decision process (MDP) model is proposed to establish the optimal action policies in different periods, and a dynamic programming algorithm is proposed to derive the optimal solution. A numerical experiment displays the improvement achieved by the MDP model, and some managerial suggestions are also provided.
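The abstract describes a finite-horizon MDP over a scarce resource stock, with three priority classes and waiting, deterioration, and transfer costs. A minimal sketch of that style of model is below; it is not the paper's formulation — the horizon, cost values, arrival distributions, and the end-of-horizon transfer rule are all illustrative assumptions — but it shows how backward induction (dynamic programming) yields the optimal allocation policy under such costs.

```python
from functools import lru_cache
from itertools import product

# Hypothetical parameters, NOT taken from the paper.
T = 3                       # planning horizon (periods)
S0 = 5                      # initial stock of the scarce resource
WAIT  = (5.0, 3.0, 1.0)     # waiting cost per period, by priority class
DETER = (4.0, 2.0, 0.5)     # deterioration cost per period of delay
TRANS = (9.0, 6.0, 3.0)     # cost of transferring an injury to another hospital

# Assumed arrival distribution per class: list of (count, probability) pairs.
ARRIVALS = [((0, 0.5), (1, 0.5)),
            ((1, 0.6), (2, 0.4)),
            ((1, 1.0),)]

def allocations(stock, queue):
    """Enumerate feasible allocations (a1, a2, a3) with a_i <= queue_i, sum <= stock."""
    for a1 in range(min(stock, queue[0]) + 1):
        for a2 in range(min(stock - a1, queue[1]) + 1):
            for a3 in range(min(stock - a1 - a2, queue[2]) + 1):
                yield (a1, a2, a3)

@lru_cache(maxsize=None)
def value(t, stock, backlog):
    """Expected minimal cost-to-go from period t, given stock and per-class backlog."""
    if t == T:  # horizon end: anyone still waiting is transferred elsewhere
        return sum(b * c for b, c in zip(backlog, TRANS))
    exp_cost = 0.0
    # Expectation over the joint arrival outcome in period t.
    for outcome in product(*ARRIVALS):
        prob = 1.0
        queue = list(backlog)
        for i, (d, p) in enumerate(outcome):
            prob *= p
            queue[i] += d
        # Optimal allocation for this realisation of arrivals.
        best = float("inf")
        for a in allocations(stock, queue):
            left = tuple(q - x for q, x in zip(queue, a))
            stage = sum(l * (w + dc) for l, w, dc in zip(left, WAIT, DETER))
            best = min(best, stage + value(t + 1, stock - sum(a), left))
        exp_cost += prob * best
    return exp_cost

print(round(value(0, S0, (0, 0, 0)), 2))  # expected optimal cost from an empty system
```

Because high-priority classes carry the largest waiting and deterioration costs here, the induced policy serves them first, which is the priority structure the abstract describes.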

Keywords: emergency management; medical resource allocation; Markov decision process; multi-priority injuries; operational research; process modelling; emergency medical resources; large-scale disasters; emergency response; waiting cost; resource shortages; deterioration cost; delay; transfer cost; hospitals.

DOI: 10.1504/IJMOR.2017.080738

International Journal of Mathematics in Operational Research, 2017 Vol.10 No.1, pp.1 - 17

Received: 10 Nov 2014
Accepted: 27 Jan 2015

Published online: 06 Dec 2016
