5/24/2019

The Use of UAVs in Humanitarian Relief

Raissa Zurli Bittencourt Bravo, Adriana Leiras, and Fernando Luiz Cyrino Oliveira, The Use of UAVs in Humanitarian Relief: An Application of POMDP-Based Methodology for Finding Victims, Production and Operations Management, Vol. 28, No. 2, February 2019, pp. 421–440.
Abstract:
Researchers have proposed the use of unmanned aerial vehicles (UAVs) in humanitarian relief to search for victims in disaster-affected areas. Since UAVs must search through the entire affected area to find victims, the path-planning operation becomes equivalent to an area coverage problem. In this study, we propose an innovative method for solving such a problem based on a Partially Observable Markov Decision Process (POMDP), which considers the observations made from UAVs. The formulation of the UAV path planning is based on the idea of assigning higher priorities to the areas that are more likely to have victims. We applied the method to three illustrative cases, considering different types of disasters: a tornado in Brazil, a refugee camp in South Sudan, and a nuclear accident in Fukushima, Japan. The results demonstrated that the POMDP solution achieves full coverage of disaster-affected areas within a reasonable time span. Through a detailed multivariate sensitivity analysis, we evaluate the traveled distance and the operation duration (which were quite stable), as well as the time required to find groups of victims. The comparisons with a Greedy Algorithm showed that the POMDP finds victims more quickly, which is the priority in humanitarian relief, whereas the Greedy Algorithm focuses on minimizing the traveled distance. We also discuss the ethical, legal, and social acceptance issues that can influence the application of the proposed methodology in practice.
Methodology:
The methodology has four steps: modeling, scenario generation, solving, and analyzing statistics.
The first step, modeling, concerns mapping the affected area to be flown and choosing the type of UAV to be used, which determines the necessary flight height. In this step, the area to be flown is viewed as a Markov chain, where each state is a part of the affected area and has a priority, which represents the probability of containing victims.
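A minimal sketch of this discretization, assuming a hypothetical 10 x 10 grid and randomly assigned priorities (none of these values come from the paper): each cell is one state, and normalizing the priorities yields an initial belief over where victims might be.

```python
import numpy as np

# Hypothetical 10 x 10 grid covering the affected area; each cell is one state.
ROWS, COLS = 10, 10

# Priorities from 1 (low) to 3 (high) express how likely a cell is to contain
# victims; here they are drawn at random purely for illustration.
rng = np.random.default_rng(42)
priorities = rng.integers(1, 4, size=(ROWS, COLS))

# Normalizing the priorities gives an initial belief (a probability
# distribution over cells) that the POMDP updates as the UAV observes cells.
belief = priorities / priorities.sum()

print("Priority map:\n", priorities)
print("Initial belief sums to", belief.sum())
```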
The scenario generation step considers a hypothetical distribution of victims in the affected area to test the solver in terms of time to find victims, distance traveled, operation duration, and coverage percentage. Three scenarios (best, mixed, and worst) are considered, each of which assumes that the victims are located in cells with some population density (visible as the urban areas on the map).
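One way such hypothetical victim placements could be generated; here the "best" scenario is interpreted as victims sitting in high-priority cells, the "worst" as victims in low-priority cells, and the "mixed" as victims anywhere. That interpretation, the grid, and the number of victim groups are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
ROWS, COLS = 10, 10
priorities = rng.integers(1, 4, size=(ROWS, COLS))  # illustrative priority map

def place_victims(priorities, scenario, n_groups=5, rng=rng):
    """Return the cells holding victim groups under one test scenario."""
    flat = priorities.ravel()
    if scenario == "best":      # victims in the most likely (priority-3) cells
        candidates = np.flatnonzero(flat == 3)
    elif scenario == "worst":   # victims in the least likely (priority-1) cells
        candidates = np.flatnonzero(flat == 1)
    else:                       # "mixed": victims anywhere in the grid
        candidates = np.arange(flat.size)
    chosen = rng.choice(candidates, size=n_groups, replace=False)
    return [divmod(int(c), COLS) for c in chosen]

for scenario in ("best", "mixed", "worst"):
    print(scenario, place_victims(priorities, scenario))
```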
The solving step executes the policy repeatedly until the UAV has covered the entire affected area.
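The paper's policy comes from a POMDP solver; as a simplified stand-in, the sketch below flies to the unvisited neighbor with the highest current belief (relocating to the best unvisited cell when boxed in) and stops once every cell has been visited. The grid size and priorities are again illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
ROWS, COLS = 10, 10
priorities = rng.integers(1, 4, size=(ROWS, COLS))
belief = priorities / priorities.sum()

# The eight compass moves used in the paper (N, S, E, W and the diagonals).
MOVES = [(-1, 0), (1, 0), (0, 1), (0, -1), (-1, 1), (-1, -1), (1, 1), (1, -1)]

def fly_until_covered(belief, start=(0, 0)):
    """Simplified coverage loop: move to the unvisited neighbor with the
    highest belief; if all neighbors are visited, jump to the best unvisited cell."""
    belief = belief.copy()
    visited = np.zeros_like(belief, dtype=bool)
    pos, path = start, [start]
    visited[start] = True
    while not visited.all():
        neighbors = [(pos[0] + dr, pos[1] + dc) for dr, dc in MOVES
                     if 0 <= pos[0] + dr < ROWS and 0 <= pos[1] + dc < COLS
                     and not visited[pos[0] + dr, pos[1] + dc]]
        if neighbors:
            pos = max(neighbors, key=lambda c: belief[c])
        else:  # dead end: relocate to the most promising unvisited cell
            pos = tuple(np.unravel_index(
                np.argmax(np.where(visited, -1, belief)), belief.shape))
        visited[pos] = True
        belief[pos] = 0.0          # cell observed, no need to revisit it
        path.append(pos)
    return path

print("Cells visited:", len(fly_until_covered(belief)))
```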
The analyzing statistics step consists of calculating four indicators: traveled distance, operation duration, coverage percentage, and time to find groups of victims. 
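A sketch of how the four indicators could be computed from one simulated flight path; the cell width, cruise speed, and grid size are assumed values, not figures from the paper.

```python
import math

def analyze(path, victim_cells, cell_km=0.1, speed_kmh=40.0, total_cells=100):
    """Compute the four indicators for one simulated flight.

    path          -- sequence of (row, col) cells in visiting order
    victim_cells  -- set of cells that actually contain victim groups
    cell_km       -- assumed width of one grid cell in km (illustrative)
    speed_kmh     -- assumed UAV cruise speed in km/h (illustrative)
    """
    # Traveled distance: Euclidean length of the path, scaled by cell width.
    distance = sum(math.dist(a, b) for a, b in zip(path, path[1:])) * cell_km
    duration_min = distance / speed_kmh * 60          # operation duration
    coverage = len(set(path)) / total_cells * 100     # coverage percentage
    # Time to find groups of victims: minutes elapsed when the last group is reached.
    cumulative, found_at = 0.0, []
    remaining = set(victim_cells)
    for a, b in zip(path, path[1:]):
        cumulative += math.dist(a, b) * cell_km / speed_kmh * 60
        if b in remaining:
            remaining.discard(b)
            found_at.append(cumulative)
    time_to_find = max(found_at) if found_at else float("inf")
    return distance, duration_min, coverage, time_to_find

# Example with a toy 3-cell path on a 3 x 3 grid and one victim group.
print(analyze([(0, 0), (0, 1), (1, 1)], {(1, 1)}, total_cells=9))
```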
Mathematical modeling:
In all cases, the states have priorities from 1 to 3, according to the probability of containing affected people. We programmed the solver with eight actions (north, south, east, west, north-east, north-west, south-east, south-west), two observations (y: yes, there is a victim; n: no, there is not a victim), and two rewards (1 for finding victims, 0 for not finding victims). Details are in the appendix file.
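The elements above can be written down directly; the sketch below lists the actions, observations, and rewards named in the text and adds a Bayesian per-cell belief update, where the sensor accuracy values are assumptions made for illustration (the full model detail is in the paper's appendix).

```python
import numpy as np

# POMDP elements as described in the paper: eight compass actions, two
# observations (y = victim seen, n = no victim), and rewards 1 / 0.
ACTIONS = ["N", "S", "E", "W", "NE", "NW", "SE", "SW"]
OBSERVATIONS = ["y", "n"]
REWARD = {"y": 1, "n": 0}

# Illustrative sensor model (NOT from the paper): probability of observing "y"
# given that the overflown cell does or does not contain victims.
P_Y_GIVEN_VICTIM = 0.9
P_Y_GIVEN_EMPTY = 0.1

def update_belief(belief, cell, observation):
    """Bayesian update of the per-cell victim belief after overflying `cell`."""
    belief = belief.copy()
    prior = belief[cell]
    if observation == "y":
        likelihood_victim, likelihood_empty = P_Y_GIVEN_VICTIM, P_Y_GIVEN_EMPTY
    else:
        likelihood_victim, likelihood_empty = 1 - P_Y_GIVEN_VICTIM, 1 - P_Y_GIVEN_EMPTY
    posterior = (likelihood_victim * prior) / (
        likelihood_victim * prior + likelihood_empty * (1 - prior))
    belief[cell] = posterior
    return belief

# Example: a cell believed to hold victims with probability 0.3, observation "n".
belief = np.full((10, 10), 0.3)
print(update_belief(belief, (2, 3), "n")[2, 3])   # belief drops below 0.3
```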
Comparison:

When the greedy algorithm is forced to travel over the entire area in the Xanxerê example, the results showed that the average distance traveled under the greedy algorithm was 0.3 km longer than under the POMDP (for the mixed scenario), the average operation duration was 0.33 minutes longer, and the average time to find groups of victims was 2 minutes longer. The most significant difference was the time to find groups of victims: the POMDP was 20% faster than the greedy algorithm because the POMDP is biased toward saving lives, updating its belief at each iteration through observations, whereas the greedy algorithm focuses on minimizing the distance traveled.
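For contrast, a minimal greedy coverage sketch that always flies to the nearest unvisited cell and ignores the belief about where victims are; the grid size and start cell are arbitrary, and this is only an illustration of the distance-minimizing behavior described above, not the paper's exact greedy algorithm.

```python
import math

ROWS, COLS = 10, 10

def greedy_coverage(start=(0, 0)):
    """Visit every cell by always flying to the nearest unvisited one."""
    unvisited = {(r, c) for r in range(ROWS) for c in range(COLS)} - {start}
    pos, path = start, [start]
    while unvisited:
        pos = min(unvisited, key=lambda c: math.dist(pos, c))
        unvisited.discard(pos)
        path.append(pos)
    return path

path = greedy_coverage()
print("Cells visited:", len(path))
```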
