Mobito facilitates the exchange of data between companies through an advanced data platform that allows them to streamline their interaction with third-party data and monetize their own data. A key industry we focus on is the insurance market, where incumbents and challengers alike increasingly rely on external data to develop new products, improve their services and deliver better, more personalized user experiences. Below, we outline some of the developments that underpin the industry’s evolution and the key needs and challenges around successful data usage.
Insurance is principally involved in the assessment and pricing of risk. It offers individuals, corporations and governments protection against future adverse events that may cause them financial loss. In its most typical model, the insured pays a recurring amount, i.e. the price of the insurance product, to buy protection against specific damages that may affect an insured asset. Fundamentally, this price depends on the expected cost of the damage and the likelihood of it occurring.
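The pricing logic described above can be sketched in a few lines of Python. This is a deliberately simplified illustration: the 30% loading and all figures below are assumptions made for the example, not market data or any insurer's actual method.

```python
def pure_premium(p_event: float, expected_loss: float) -> float:
    """Expected annual cost of the risk: the probability of the event
    times the expected loss if it occurs."""
    return p_event * expected_loss

def gross_premium(p_event: float, expected_loss: float, loading: float = 0.3) -> float:
    """Pure premium plus a loading for expenses and profit margin.
    The 30% loading is an illustrative assumption."""
    return pure_premium(p_event, expected_loss) * (1 + loading)

# A 2% annual flood probability with an expected loss of 50,000
# gives a pure premium of 1,000 per year.
print(pure_premium(0.02, 50_000))   # 1000.0
print(gross_premium(0.02, 50_000))
```

Across a portfolio, the insurer's profitability then hinges on how well `p_event` and `expected_loss` are estimated, which is exactly where data quality enters the picture.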
All else equal, property flood insurance in coastal Shanghai will probably cost more than the same insurance in Texas. In this sense, the job of the insurer is to price insurance products adequately so that, on average across a portfolio of clients, the insurer is profitable while still offering competitive rates.
Traditionally, insurers have relied on long historical records to understand the likelihood or frequency of an event. Analysing 50 years of flood records may allow an insurer to probabilistically map the future likelihood of flood events and underpin pricing and product features. In this case, the model would also need to account for the fact that climate change has rendered historical data less representative, as the location, frequency and severity of flood events are changing. In this sense, much of traditional insurance risk pricing relies on analyzing reliable historical data.
Importantly, when a claim is made on an insured product, say after hail damages an insured car, the underwriter needs to verify that certain conditions are met and assess the extent of the damage before releasing the claim payment, a process that can take months. Between 2008 and 2014, US insurers paid out $5.37 billion in hail-related auto insurance claims.
Two key drivers in improving insurers’ portfolio performance and the associated pricing of underlying risk are:
1) integrating more, and more relevant, data into risk models, and
2) advancements in data processing that improve the quality and predictive capability of these models’ output.
Times are changing rapidly, and a confluence of technological and societal shifts is driving new developments in the insurance market. These include the rapid growth of data-capturing IoT devices, advances in machine learning, and changing consumer habits that demand more personalised and responsive products. Not surprisingly, the ability to gather more data from the real world, matched with advances that allow new and smarter ways of making sense of that data, is generating brand new opportunities for insurance products.
Third-party data supports insurers in the following three core business activities:
i) better pricing of risk, by enriching models with previously unavailable external data,
ii) automating parts of claims processing or verifying claim damages, by accessing real-world data evidence,
iii) improving user experience through real-time, personalised, context-specific alerts and other services.
Automating Parts of Claims Processing
A category of the insurance landscape that has embraced many of these new capabilities is parametric insurance. Companies in this space are proponents of a new model that disrupts how insurance claims are processed. A parametric insurance product usually includes a predefined threshold for some data measurement, or parameter, which, if reached, can automatically or semi-automatically trigger the release of a claim. This way, parametric insurance pays a predefined amount after the occurrence of a triggering event. For example, companies might sell home sensors that measure flood depth (e.g. in cm), so that when a local flood depth of 4 cm is reached, the insurer releases a predefined payment. Similar insurance products focus on the risk of hurricanes tracked by wind speed and barometric pressure, earthquakes tracked by magnitude, and even flight cancellations and pandemic/epidemic outbreaks.
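The trigger mechanism described above reduces to a simple rule: compare a measured parameter against a predefined threshold and release a fixed payout. The class name, threshold and payout below are illustrative assumptions, not taken from any real product.

```python
from dataclasses import dataclass

@dataclass
class ParametricPolicy:
    """A hypothetical parametric flood policy; all values are illustrative."""
    threshold_cm: float   # flood depth that triggers the payout
    payout: float         # predefined payment released on trigger

    def evaluate(self, measured_depth_cm: float) -> float:
        """Return the payout if the measured parameter reaches the
        threshold, otherwise nothing. No damage assessment is needed:
        the payment amount is fixed in advance."""
        return self.payout if measured_depth_cm >= self.threshold_cm else 0.0

policy = ParametricPolicy(threshold_cm=4.0, payout=10_000.0)
print(policy.evaluate(3.2))  # 0.0     -> below threshold, no claim
print(policy.evaluate(5.1))  # 10000.0 -> trigger reached, payout released
```

The key design property is that the payout depends only on the observed parameter, not on a case-by-case damage assessment, which is what makes (semi-)automatic settlement possible.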
Similarly, in traditional insurance, important initiatives focus on automating parts of the claims cycle, such as the first notification of loss (FNOL), the initial notification an insured client makes when damage has occurred. Successful automation may lead to improved days-to-pay, reduced processing costs and growing customer demand. In auto insurance, improved processes and automation can reduce claims processing cycles from 15 days down to 2 days.
(LexisNexis, Future of Claims Study, 2017 and 2019)
New Data Needs
Automation of the claims cycle often relies on historical and (near) real-time data parameters:
- that indicate the level of damage to the insured asset, such as vehicle data indicating the severity of a car crash or parametric devices measuring flood depth at a location,
- that track metrics correlated with the insured asset, such as footfall measurements for a mall operator or internet outages for logistics companies. In parametric insurance, such complementary data can serve as a check on the triggering event. As Nyasha Kuwana, Head of Product at parametric insurance provider FloodFlash, puts it: this is a way of “looking out the window” to make sure the context agrees with the flood sensor measurement.
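The “looking out the window” check above can be sketched as requiring two independent signals to agree before a payout is released. The function name, the use of rainfall as the contextual signal, and the thresholds are all hypothetical choices for the example.

```python
def trigger_confirmed(sensor_depth_cm: float,
                      threshold_cm: float,
                      rainfall_mm_24h: float,
                      min_rainfall_mm: float = 20.0) -> bool:
    """Confirm a parametric trigger with an independent data source.
    The payout is released only if both the primary measurement
    (flood sensor) and a correlated contextual signal (local rainfall)
    agree. The rainfall threshold is an illustrative assumption."""
    sensor_triggered = sensor_depth_cm >= threshold_cm
    context_agrees = rainfall_mm_24h >= min_rainfall_mm
    return sensor_triggered and context_agrees

# Sensor reads 5 cm but no rain fell: suspicious, hold for manual review.
print(trigger_confirmed(5.0, 4.0, rainfall_mm_24h=0.0))   # False
# Sensor reads 5 cm during heavy rain: context agrees, release the payout.
print(trigger_confirmed(5.0, 4.0, rainfall_mm_24h=35.0))  # True
```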
The current pandemic has shed light on the largely uncovered insurance space of business interruption. Depending on the type of business, business interruption may be measured and approximated by footfall measurements (retail), commercial fleet movement (logistics) or shipping traffic (trade), rather than relying solely on a company’s financials to measure the effect of disruption. In many such instances, insurance companies are building appetite for a growing set of third-party data.
In parallel, the broader insurance ecosystem is responding, with insurtech companies providing insurance-tailored, specialised data products that track and forecast hyper-localised, hyper-temporal floods, property information, temperature, internet connectivity and other parameters that insurance models are sensitive to.
New Data Requirements
The opportunities unlocked by real-time data and these new sources of third-party data present new challenges to an insurer:
- Data Sourcing: identifying and selecting the right provider
- Data Integration and connection maintenance: building third-party integrations and troubleshooting integration problems and downtime
- Data Preparation: cleaning, aggregating and normalizing data
- Data Intelligence: data analysis including AI/ ML processing
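As an illustration of the Data Preparation step above, the sketch below normalizes records from two hypothetical providers that report flood depth in different units and under different field names. The provider names and field layouts are assumptions made for the example.

```python
def normalize_record(record: dict, provider: str) -> dict:
    """Map a provider-specific record to a common schema
    (location key, depth in cm). Provider names and field
    layouts are illustrative assumptions."""
    if provider == "provider_a":   # hypothetical: reports depth in millimetres
        return {"location": record["loc"], "depth_cm": record["depth_mm"] / 10}
    if provider == "provider_b":   # hypothetical: reports depth in inches
        return {"location": record["site_id"], "depth_cm": record["depth_in"] * 2.54}
    raise ValueError(f"unknown provider: {provider}")

raw_a = {"loc": "NL-AMS-01", "depth_mm": 40}
raw_b = {"site_id": "NL-AMS-01", "depth_in": 1.6}
print(normalize_record(raw_a, "provider_a"))  # {'location': 'NL-AMS-01', 'depth_cm': 4.0}
print(normalize_record(raw_b, "provider_b"))
```

Only after this kind of cleaning and normalization can records from different sources feed one risk model, which is why these “data logistics” absorb so much skilled engineering time.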
These processes consume a lot of time from highly skilled personnel and divert attention from core, value-adding insurance tasks. For an insurer to remain competitive and leverage the value of data to enable these new possibilities, collaborating with companies that can take on part of these tasks is often key.
Mobito engages as a data partner with insurers that have advanced data needs. The Mobito Data Platform allows insurers to outsource much of this “data logistics” and offers a streamlined, centralized way to access and maintain connections to diverse data. Specifically, Mobito provides access to a diverse set of insurance-focused data, including precise flood forecasting, property attributes and air quality, as well as mobility-focused data ranging from driving behavior to footfall measurements and oil tanker routes.
We are looking for new challenges and are working on innovative insurance products. Are you an insurer or a relevant data provider interested in collaborating?