Title: Assessment of emergency response policy based on Markov process

Authors: Yafei Zhou; Mao Liu

Addresses: Center for Security and Emergency Technology Research, China Waterborne Transport Research Institute, Beijing 100088, China; Center for Urban Public Safety Research, Nankai University, Tianjin 300071, China

Abstract: Major accidents not only endanger the health and safety of the population but also harm the surrounding environment. To mitigate these adverse effects, an emergency response policy (ERP) and corresponding protective actions should be established. To maximise its effectiveness, the ERP should be evaluated and optimised, and the most important criterion is minimising the health consequences of the accident. In this paper, a discrete-state stochastic Markov process is used to simulate the movement of evacuees. The solution of the Markov process gives the expected distribution of evacuees in the area as a function of time. Then, based on how the extreme phenomena affect individuals and on the dose-response relationship, the health effects on the population are calculated, determining the accident's health consequences. Finally, different emergency response policies are evaluated by their corresponding health consequences, so that the emergency policy can be optimised.
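The approach described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual model: the zones, transition probabilities, and dose rates below are all invented assumptions, chosen only to show how a discrete-state Markov chain yields an expected evacuee distribution over time and how two policies can be compared by expected dose.

```python
import numpy as np

# Three illustrative zones: 0 = hazard area, 1 = evacuation route, 2 = shelter.
# Rows of P are one-step transition probabilities (each row sums to 1);
# the shelter is absorbing. All values are hypothetical.
P_slow = np.array([
    [0.6, 0.4, 0.0],   # hazard area: stay or move onto the route
    [0.0, 0.5, 0.5],   # route: stay or reach the shelter
    [0.0, 0.0, 1.0],   # shelter: absorbing state
])

# A second, hypothetical "faster evacuation" policy for comparison.
P_fast = np.array([
    [0.3, 0.7, 0.0],
    [0.0, 0.2, 0.8],
    [0.0, 0.0, 1.0],
])

p0 = np.array([1.0, 0.0, 0.0])        # all evacuees start in the hazard area
dose_rate = np.array([1.0, 0.3, 0.0])  # assumed dose received per step in each zone

def expected_distribution(p0, P, t):
    """Expected occupancy of each zone after t steps: p0 @ P^t."""
    return p0 @ np.linalg.matrix_power(P, t)

def expected_dose(p0, P, steps):
    """Accumulate expected per-capita dose over the evacuation period."""
    p, dose = p0.copy(), 0.0
    for _ in range(steps):
        dose += p @ dose_rate   # exposure weighted by where evacuees are
        p = p @ P               # advance the chain one step
    return dose

# Distribution after 10 steps under the slower policy: most mass in shelter.
p10 = expected_distribution(p0, P_slow, 10)
```

Comparing `expected_dose(p0, P_slow, 20)` with `expected_dose(p0, P_fast, 20)` then ranks the two policies by health consequence, mirroring the evaluation step the abstract describes: the policy that moves evacuees out of the high-dose zone sooner accumulates less expected dose.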

Keywords: emergency response policy; stochastic modelling; Markov process; evacuation models; health consequences; emergency management; disaster management; disaster response; evacuees; evacuee movements; simulation; accidents.

DOI: 10.1504/IJGCRSIS.2013.057235

International Journal of Granular Computing, Rough Sets and Intelligent Systems, 2013 Vol.3 No.2, pp.95 - 105

Available online: 18 Oct 2013
