
Mind the (data) gap, with Paul Nunn, SCOR P&C

April 9, 2018
In this series of blog posts, we talk with leading figures in the insurance industry about identifying and closing the data gap. In the first post of the series, I'm joined by Paul Nunn, Head of Catastrophe Risk Modelling at SCOR P&C.


PN = Paul Nunn

JC = Jim Craig


JC

Paul, can you start by giving your thoughts on the various data initiatives you're seeing at the moment?


PN 

Sure. We see growing demand for high-resolution flood data. There is more flooding, driven by changing land use and the regional impacts of climate change, so the hazard is continually moving. This data would help us scrutinise and challenge our understanding of new risk models.

Disclosures around climate risk are essential. Insurance looks at risk today, and at the impact in terms of costs now; what will the effects and costs be in 2050, or 2100? The scientific knowledge and data reflected in our models are mostly backward-looking, and we need to incorporate the future IPCC pathways. The further out you go, the more difficult it is to bring people with you. What does sea level rise look like in 5 or 10 years? We need to be able to set management actions that allow us to execute a strategy to keep our insurance offering relevant.

People simply cannot focus on 2100, so we need to look ahead more practically and take baby steps into the future. 


JC

What sort of steps would you suggest, or like to see?


PN

If we take a hazard perspective, then better data to support wildfire analysis is essential. Wildfire losses roughly trebled in 2017, from $3.5Bn to a record $10Bn.

We would like datasets to validate wildfire models, maybe even a global wildfire database with a synthesised view. A lot of organisations are building wildfire models at the moment. The IPCC forecast is for a warmer world, which makes for bigger, more damaging fires. In Europe, we've seen this recently in Portugal and Spain, and also in North America: Alberta, California, and Colorado.

Extending this, specifically for the insurance sector, we would welcome a library of vulnerability functions. Such a library would translate hazard intensity (for example, wind speed) into a damage level, based on the fragility of the structure; this would be a move into the engineering arena for Oasis Hub.

The library would need to accommodate differences in local building practices from country to country, which matter particularly for residential risks. Modern industrial facilities are similar the world over, so vulnerability functions for those classes might be more portable.
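To make the idea concrete, here is a minimal sketch in Python of what one entry in such a library might look like. Everything in it is an illustrative assumption rather than anything from SCOR or Oasis Hub: the lognormal-CDF curve shape is a common convention for fragility and vulnerability curves, but the construction classes and (median, beta) parameters below are hypothetical placeholders, not calibrated values.

```python
import math

# Illustrative vulnerability curves: mean damage ratio as a function of peak
# gust wind speed (m/s), modelled as a lognormal CDF. The (median, beta)
# parameters per construction class are made-up placeholders.
CURVE_PARAMS = {
    "residential_timber":  (45.0, 0.35),   # damages at lower wind speeds
    "residential_masonry": (55.0, 0.30),
    "industrial_steel":    (70.0, 0.25),   # more robust, more portable globally
}

def mean_damage_ratio(wind_speed_ms: float, construction_class: str) -> float:
    """Return expected damage as a fraction of total insured value (0..1)."""
    median, beta = CURVE_PARAMS[construction_class]
    if wind_speed_ms <= 0:
        return 0.0
    # Lognormal CDF evaluated via the error function (standard library only).
    z = math.log(wind_speed_ms / median) / (beta * math.sqrt(2))
    return 0.5 * (1.0 + math.erf(z))

# Example: expected damage ratios for a 60 m/s gust, by construction class.
if __name__ == "__main__":
    for cls in CURVE_PARAMS:
        print(cls, round(mean_damage_ratio(60.0, cls), 3))
```

Keying the parameters by construction class is exactly where Paul's portability point bites: the residential entries would need recalibration country by country, while an industrial class might travel largely unchanged.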


JC

Would regularly updated satellite imagery, combined with building data, help identify potential future losses from land-use change?


PN

Insurers typically accept risks and live with the outcome for 12 months. We already have to work with a massively simplified representation of complex, real-world risks. For an individual homeowner's insurer, the extra detail can be good; for us, as a global reinsurer, it creates vast amounts of data to analyse for only limited additional insight. There is value if the question "How is the built environment affected by surface runoff?" is addressed in the macro view of a flood hazard model.


We see the ability to crowdsource data as particularly useful, so being able to commission or provision a dataset with the help of the Oasis Hub community would be a welcome feature.
