Traditionally, earthquake impact assessments have been made via fieldwork and data collection sponsored by non-governmental organisations (NGOs); however, this approach is time-consuming, expensive and often limited in scope. Recently, social media (SM) has become a valuable tool for quickly collecting large amounts of first-hand data after a disaster and shows great potential for decision-making. Nevertheless, extracting meaningful information from SM is an ongoing area of research. This paper tests the accuracy of the pre-trained sentiment analysis (SA) model developed by the no-code machine learning platform MonkeyLearn, using text data related to the emergency response and early recovery phase of the three major earthquakes that struck Albania on the 26th November 2019. These events caused 51 deaths, 3000 injuries and extensive damage. We obtained 695 tweets with the hashtags #Albania, #AlbanianEarthquake and #albanianearthquake posted between the 26th November 2019 and the 3rd February 2020, and used these data to test how accurately the pre-trained SA classification model identifies polarity in text. This test explores the feasibility of automating the classification process so that meaningful information can be extracted from SM text data in real time in the future. We evaluated the no-code machine learning platform's performance using a confusion matrix, obtaining an overall accuracy (ACC) of 63% and a misclassification rate of 37%. We conclude that the ACC of the unsupervised classification is sufficient for a preliminary assessment, but further research is needed to determine whether accuracy improves when the machine learning platform's training model is customised.
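The confusion-matrix evaluation described above can be reproduced in outline with standard tooling. The following is a minimal sketch, assuming hypothetical manually annotated polarity labels and model predictions; the class names and example labels are illustrative, not the study's 695-tweet dataset or MonkeyLearn's API.

```python
# Minimal sketch of the accuracy check: compare manual polarity labels against
# model predictions using a confusion matrix and overall accuracy (ACC).
from sklearn.metrics import confusion_matrix, accuracy_score

classes = ["Positive", "Neutral", "Negative"]            # assumed polarity classes

# Illustrative annotations only, not the tweets from the study.
y_true = ["Negative", "Neutral", "Negative", "Positive", "Negative", "Neutral"]
y_pred = ["Negative", "Negative", "Negative", "Positive", "Neutral", "Neutral"]

cm = confusion_matrix(y_true, y_pred, labels=classes)    # rows = true class, columns = predicted class
acc = accuracy_score(y_true, y_pred)                     # overall accuracy (ACC)
print(cm)
print(f"ACC = {acc:.0%}, misclassification rate = {1 - acc:.0%}")
```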
Disasters affect millions of people annually, causing large numbers of fatalities, detrimental economic impacts and the displacement of communities. Policy-makers, researchers and industry professionals are regularly faced with these consequences and therefore require tools to assess the potential impacts and provide sustainable solutions, often with only very limited information. This paper focuses on the themes of "disaster management", "natural hazards" and "simulation", aiming to identify current research trends using bibliometric analysis. This analysis technique combines quantitative and statistical methods to identify these trends, assess quality and measure development. The study has concluded that natural hazards (73%) feature more prominently in research than man-made hazards (14%). Of the man-made hazards covered, terrorism is the most prevalent (83%). The most frequent disaster types are climate related; in this study, hydrological (20%), geophysical (20%), meteorological (15%) and climatological (5%) hazards were the most frequently researched. Asia experiences the highest number of disaster events of any continent but was included in only 11% of papers in this study, with North America being the most recurrent (59%). There were some surprising omissions, such as Africa, which did not feature in a single paper. Despite the inclusion of the keywords "simulation" and "agent-based" in the searches, the study did not demonstrate that there is a large volume of research being carried out using numerical modelling techniques. Finally, research appears to take a reactive rather than proactive approach to disaster management planning, but the merit of this approach is questionable.
Disaster events cause detrimental impacts for communities across the globe, ranging from large numbers of fatalities and injuries, to the loss of homes and devastating financial impacts. Emergency professionals are faced with the challenge of providing sustainable solutions to mitigate these consequences and require tools to aid the assessment of potential impacts. Current modelling tools have focused on either the microscale (e.g. individual confined spaces such as buildings or stadiums) or the macroscale (e.g. city scale). The aim of this research is to create microscale agent-based modelling (ABM) tools, incorporating a realistic representation of human behaviours, which will help management professionals assess and improve their contingency plans for emergency scenarios. The focus has been on creating a microscale agent-based model of a pedestrian pavement and crossroads, incorporating overtaking and giving way alongside varied population characteristics. This research has found that by improving the representation of pedestrian interactions (e.g. overtaking and giving way) on pavements and at crossroads, more robust travel time estimates can be achieved. To produce more realistic behaviour traits, microscale models should consider: (1) varied walking speed, (2) population density, (3) patience level and (4) an exit split percentage for crossroads. Comparisons to 1.34 m/s (3 mph) models without additional variables show the travel times may be misrepresentative by up to 78% in pavements and 305% in crossroads for some population types. This has the potential to cause cascading effects such as a significant increase in fatalities or injuries as communities cannot reach safety in the anticipated time.
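As a minimal sketch (not the authors' model), the four traits listed above could be parameterised per agent as follows; the distributions, exit names and split percentages are illustrative assumptions.

```python
# Illustrative pedestrian agent set-up with varied walking speed, patience and
# an exit split for a crossroads, instead of a uniform 1.34 m/s population.
import random

EXIT_SPLIT = {"north": 0.4, "east": 0.35, "west": 0.25}     # assumed exit split percentages

def make_pedestrian():
    return {
        "walking_speed": max(0.5, random.gauss(1.34, 0.25)),  # m/s, spread around the mean walking speed
        "patience": random.randint(1, 5),                      # time steps waited before attempting to overtake
        "exit": random.choices(list(EXIT_SPLIT), weights=EXIT_SPLIT.values())[0],
    }

population = [make_pedestrian() for _ in range(200)]          # density set by the number of agents on the pavement
```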
The 2010 eruption of the Eyjafjallajökull volcano had a devastating effect on the European air traffic network, preventing air travel throughout most of Europe for 6 days (Oroian in ProEnvironment 3:5–8, 2010). The severity of the disruption was surprising as previous research suggests that this type of network should be tolerant to random hazard (Albert et al. in Nature 406(6794):378–382, 2000; Strogatz in Nature 410(6825):268–276, 2001). The source of this hazard tolerance lies in the degree distribution of the network which, for many real-world networks, has been shown to follow a power law (Albert et al. in Nature 401(6749):130–131, 1999; Albert et al. in Nature 406(6794):378–382, 2000). In this paper, we demonstrate that the ash cloud was unexpectedly disruptive because it was spatially coherent rather than uniformly random. We analyse the spatial dependence in air traffic networks and demonstrate how the combination of their geographical distribution and their network architectures jeopardises their inherent hazard tolerance.
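A minimal sketch of the underlying idea, using a scale-free graph as a stand-in for a real route network; the graph, node counts and the ego-network proxy for a spatially coherent ash cloud are assumptions for illustration, not the paper's data or method.

```python
# Contrast random node removal with removal of a coherent cluster of nodes in a
# scale-free network, and compare the size of the surviving largest component.
import random
import networkx as nx

G = nx.barabasi_albert_graph(500, 3)       # power-law degree distribution, stand-in for an air traffic network

random_removed = random.sample(list(G.nodes), 50)                    # uniformly random "hazard"
hub = max(G.degree, key=lambda nd: nd[1])[0]
coherent_removed = list(nx.ego_graph(G, hub, radius=2).nodes)[:50]   # clustered removal, a rough proxy for an ash cloud

for label, removed in [("random", random_removed), ("coherent", coherent_removed)]:
    H = G.copy()
    H.remove_nodes_from(removed)
    giant = max(nx.connected_components(H), key=len)
    print(label, "removal -> largest connected component:", len(giant))
```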
Earthquakes are one of the most catastrophic natural phenomena. Post-earthquake reconnaissance enables effective recovery by collecting data on building damage and other impacts. This paper aims to identify state-of-the-art data sources for building damage assessment and provide guidance for more efficient data collection. We have reviewed 39 articles that indicate the sources used by different authors to collect data related to damage and post-disaster recovery progress after earthquakes between 2014 and 2021. The current data collection methods have been grouped into seven categories: fieldwork or ground surveys, omnidirectional imagery (OD), terrestrial laser scanning (TLS), remote sensing (RS), crowdsourcing platforms, social media (SM) and closed-circuit television videos (CCTV). The selection of a particular data source or collection technique for earthquake reconnaissance depends on different criteria, according to the questions the data are intended to answer. We conclude that modern reconnaissance missions cannot rely on a single data source. Different data sources should complement each other, validate collected data or systematically quantify the damage. The recent increase in the number of crowdsourcing and SM platforms used to source earthquake reconnaissance data demonstrates that this is likely to become an increasingly important data source.
Disasters affect millions of people annually, causing large social impacts, including high numbers of fatalities, the displacement of communities, and detrimental economic impacts. Emergency professionals recurrently tackle these impacts and therefore need assessment methods to understand potential consequences and deliver sustainable resolutions, regularly with incomplete information. Current models simulating human behaviours and movement exist but are bespoke in nature and non-transferable (solving one problem only), meaning it is not possible to keep software "current" or future-proofed. The aim of this research is to create an agent-based modelling (ABM) framework tool, incorporating robust models of human behaviour, to help management professionals develop and test their contingency plans for emergency scenarios. The focus has been on creating a macroscale evacuation ABM for a case study area, to assess whether the inclusion of varied population characteristics and group behaviours affects evacuation time. This research has found that by enhancing the representation of human behaviour, more accurate predictions of evacuation time can be produced. To represent human behaviour more robustly, models must include: (1) population characteristics (such as age, sex, and mobility), (2) grouping of agents, and (3) a walking speed ratio. Without the inclusion of adequate population characteristics, e.g. using agents with only a 1.34 m/s (3 mph) walking speed, the evacuation model produced misleading evacuation times, potentially increasing by 109% for some population types. This may result in knock-on effects, such as significant increases in fatalities or injuries, as populations cannot leave their homes and reach safety in the expected time.
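A minimal sketch of how grouping and a walking speed ratio might enter such a model; the characteristic-based speed adjustments and the ratio value below are illustrative assumptions, not figures from the study.

```python
# Illustrative group evacuation speed: a group moves at its slowest member's pace,
# further reduced by an assumed walking speed ratio for moving together.
WALKING_SPEED_RATIO = 0.9                     # assumed group slowdown factor

def individual_speed(age, reduced_mobility):
    """Illustrative base walking speed (m/s) from simple population characteristics."""
    speed = 1.34
    if age >= 65:
        speed *= 0.8
    if reduced_mobility:
        speed *= 0.6
    return speed

def group_speed(members):
    """Members are (age, reduced_mobility) tuples evacuating as one household."""
    return min(individual_speed(age, rm) for age, rm in members) * WALKING_SPEED_RATIO

print(f"{group_speed([(34, False), (71, False), (8, False)]):.2f} m/s")   # slowest member sets the pace
```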
This paper presents an investigation of the collapse of a 325-year-old multi-tiered heritage temple during the 2015 Gorkha earthquake in Kathmandu, Nepal. The research comprises a reconnaissance survey followed by a geotechnical investigation and numerical back-analysis carried out to understand the potential causes of the collapse. The assessment of the structural configuration of the temple indicated seismic vulnerability in the design due to the presence of discontinuous columns over the height of the temple and age-weakened bonding in the masonry walls. The geotechnical investigation revealed the presence of competent soil strata at the location, supporting the survey finding of no differential or excessive settlement in the foundation. A series of cyclic triaxial tests were conducted on samples recovered during the geotechnical investigation to determine the dynamic behaviour of the soil. Further, dynamic analysis of the plinth of the temple under the recorded acceleration–time history indicated a maximum drift percentage of 1.4% and a residual relative displacement of 32 mm, suggesting a potential reason behind the collapse. The output of this research will support seismic rehabilitation of ancient structures within World Heritage sites across Nepal and effective action plans to safeguard them against future earthquake hazard.
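For readers unfamiliar with the drift metric, drift percentage is simply the relative displacement divided by the height over which it is measured; in the minimal sketch below, both the reference height and the displacement are illustrative assumptions rather than values taken from the paper.

```python
# Drift (%) = peak relative displacement / reference height x 100.
# Both numbers below are illustrative; only the formula is the point.
reference_height_mm = 2300        # assumed plinth/storey height, not reported above
relative_displacement_mm = 32.2   # illustrative peak relative displacement

drift_percent = relative_displacement_mm / reference_height_mm * 100
print(f"drift = {drift_percent:.1f}%")
```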