
Analytics for predicting, managing disasters


Over the course of two days, severe storms wreaked havoc across two provinces in South Africa. KwaZulu-Natal was declared a provincial disaster area after torrential rains left at least eight people dead and flooding caused massive damage. In Gauteng, at least three people died and hundreds were left homeless when a tornado and hailstorms ripped through the East and West Rand.
It sounds eerily familiar, doesn’t it? In November last year, flash floods left motorists stranded and caused millions of rands in damage.
November 2016, October 2017: severe storms, flash floods, destructive hail… are you seeing a pattern too?
While there is nothing we can do to prevent natural disasters or severe weather events, we can use big data and advanced analytics to respond to disasters faster and more effectively. By analysing historical data and applying the resulting patterns to new data before, during and after a natural disaster, emergency services can target their relief efforts far more precisely.

Before a disaster occurs
According to Aneshan Ramaloo, senior business solutions manager at SAS, the effective use of analytics to study data related to geography, population, mobile device usage and many other data points can help authorities discern underlying patterns and associations that will enable the relevant emergency services to quickly react to floods, fires and other deadly scenarios.
“Emergency response agencies can even model potential disasters and their effects, to allow authorities to develop proactive plans to prevent these outcomes from occurring,” he says.
This can be achieved by using data from synthetic populations with detailed demographics, family structures, travel patterns and activities. These populations are constructed to mirror real census, social, transit and telecoms data, meaning authorities can effectively build virtual cities on which to test various disaster management strategies.
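As a rough illustration of the synthetic-population idea (and not of SAS's actual tooling): the zones, household fields and probabilities below are invented for the sketch, but the shape of the exercise, building a toy population and stress-testing a scenario against it, is the one described above.

```python
import random

random.seed(42)  # reproducible sketch

# Build a toy synthetic population: households with a location,
# size and mobility need, loosely mirroring census-style fields.
def build_synthetic_population(n_households):
    return [
        {
            "zone": random.choice(["north", "south", "east", "west"]),
            "members": random.randint(1, 6),
            "needs_mobility_support": random.random() < 0.1,
        }
        for _ in range(n_households)
    ]

# Simulate a flood hitting one zone and count who is affected,
# so planners can compare response strategies before a real event.
def simulate_flood(population, flooded_zone):
    affected = [h for h in population if h["zone"] == flooded_zone]
    return {
        "households": len(affected),
        "people": sum(h["members"] for h in affected),
        "households_needing_assistance":
            sum(1 for h in affected if h["needs_mobility_support"]),
    }

population = build_synthetic_population(10_000)
print(simulate_flood(population, "east"))
```

Planners would rerun the simulation across many scenarios (different zones, severities, times of day) and compare the resulting casualty and assistance estimates.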

During a disaster
“The real power of analytics is that it can utilise big data derived from multiple sources, such as population demographics, weather patterns, flood zones, town planning and even cartography, which is then built into a disaster management strategy.
“Performing real-time advanced analytics on all of this information will enable authorities to provide quicker, more effective responses to affected areas even while the disaster is still unfolding.
“For example, authorities can direct rescue crews or fire fighters to those areas most in need. By having immediate access to all crisis data, analytics can help staff save lives. It can also be used for prescribing the kind of medicines, food and medical equipment that will be required in specific areas.
“Authorities can also use analytics to determine where both the safest and most optimal areas are to set up treatment centres. This would entail taking various factors into account, such as traffic, road networks, nearby hospitals, closest supply centres and infected populations.
“A Canadian insurer did exactly this in 2013, after torrential rains, described at the time as the costliest natural disaster in Canadian history, caused the evacuation of 100,000 people and four deaths. SAS researchers used data from that disaster to model the catastrophe, allowing executives to monitor the volume of incoming claims and, by geocoding them, map out which customers and brokers were most affected.”
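The geocode-and-aggregate step in the insurer example can be sketched as follows. The suburb lookup table, coordinates and claim records are invented stand-ins; a production system would resolve full addresses through a real geocoding service.

```python
from collections import defaultdict

# Toy lookup standing in for a real geocoding service (assumption:
# a real system would resolve full street addresses to coordinates).
SUBURB_COORDS = {
    "Riverside": (51.04, -114.06),
    "Hillcrest": (51.05, -114.10),
}

# Invented claim records for illustration.
claims = [
    {"suburb": "Riverside", "amount": 12_000},
    {"suburb": "Riverside", "amount": 8_500},
    {"suburb": "Hillcrest", "amount": 3_000},
]

# Geocode each claim and aggregate by location so the worst-hit
# areas surface first.
totals = defaultdict(lambda: {"count": 0, "amount": 0})
for claim in claims:
    coords = SUBURB_COORDS[claim["suburb"]]
    totals[coords]["count"] += 1
    totals[coords]["amount"] += claim["amount"]

# Rank locations by total claimed amount, highest first.
for coords, agg in sorted(totals.items(), key=lambda kv: -kv[1]["amount"]):
    print(coords, agg)
```

Plotting the aggregated points on a map is then a presentation step; the analytical work is the geocoding and grouping shown here.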

After a disaster
While it is impossible to prevent natural disasters from occurring, it is possible to utilise data derived from previous catastrophes to mitigate future ones. Ramaloo points out that information related to weather and climate, road, utility and environmental vulnerabilities can play a major role in helping emergency management services to plot the emergency strategies of tomorrow.
“This approach applies to people as well. Government can make use of data related to population subsets — such as elderly communities, infant and youth concentrations and areas where individuals need specific mobility support — to ensure that, should a disaster strike, responders can apply resources to those locations.
“After the Nepal earthquake, the International Organization for Migration, as the first responder, reacted to provide temporary shelter and to coordinate other relief agencies providing the essentials such as food and water. SAS helped the IOM analyse data to determine where the high-risk shelters were and to better allocate resources. While most of the relief efforts focused on Kathmandu, the data showed that another nearby district had more children and therefore needed more diapers, formula, children’s medicine and other supplies for nursing mothers. These were quick, but important, insights to guide relief efforts.”
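The proportional-allocation logic behind the diaper-and-formula insight can be sketched in a few lines. The district names and child counts below are placeholders, not the IOM's actual figures (the article does not name the nearby district).

```python
# Placeholder district figures standing in for the IOM's real data.
districts = {
    "Kathmandu": {"children": 40_000},
    "nearby_district": {"children": 60_000},
}

# Split a fixed stock of child-specific supplies in proportion to
# the number of children in each district.
def allocate(stock, districts):
    total = sum(d["children"] for d in districts.values())
    return {name: round(stock * d["children"] / total)
            for name, d in districts.items()}

print(allocate(10_000, districts))
```

The same proportional split works for any demographic-keyed resource: wheelchairs against mobility-support counts, or chronic medication against elderly populations.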
With big data, it is even possible to understand how residents may react to a future catastrophe. Data scientists can analyse records captured by local mobile network operators to understand how populations move in response to an emergency such as a flood.
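A minimal sketch of that kind of mobility analysis, assuming simplified, invented tower-ping records in place of real (and necessarily anonymised) operator data:

```python
from collections import Counter

# Invented tower-ping records: (subscriber_id, tower_zone, phase).
# Real operator data would be far richer, and anonymised.
pings = [
    ("a", "riverbank", "before"), ("a", "highground", "during"),
    ("b", "riverbank", "before"), ("b", "highground", "during"),
    ("c", "highground", "before"), ("c", "highground", "during"),
]

# Count subscribers seen per zone in each phase of the event.
def zone_counts(pings, phase):
    return Counter(zone for _, zone, p in pings if p == phase)

before = zone_counts(pings, "before")
during = zone_counts(pings, "during")

# Net change per zone: negative = outflow, positive = inflow.
shift = {z: during.get(z, 0) - before.get(z, 0)
         for z in set(before) | set(during)}
print(shift)
```

Even this crude before-and-after comparison reveals where people fled from and where they gathered, which is the input planners need for siting shelters and supply routes.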
“Of course, for data to be effective, it must be shareable. Bringing together various sources of data helps to create a kind of spatial data infrastructure, making the development of policies, protocols and ways of exchanging information an ongoing priority, and leading to the creation of new best-case scenarios.
“Analysing data like this can even prove useful where no previous disasters have occurred: it can still give authorities a sense of how potential disasters might impact a particular region. This, in turn, means government can develop proactive plans to be ready for any eventuality, in any area, no matter how unlikely,” he concludes.