Gavin Lillywhite explains why it’s time to stop talking and start acting to improve data quality and accessibility.
The London market has spent at least a decade eulogising technology.
From conference keynotes and Lloyd’s management addresses to thought leadership papers and articles like this, our enthusiasm for “embracing” new systems and solutions would make Edward Lloyd, a Protestant elder as well as founder of the famous Coffee House, blush.
However, scratch the surface and the actual rate of implementation belies the talk. The London market is falling behind the US, and well behind personal lines-focused peers.
Paper may have gone, but email and PDF attachments now reign, legacy systems abound, and software systems are siloed between lines and functions.
If data is “the new oil”, we’re not drilling some of the most promising fields, nor refining it, storing it properly, or delivering it to those who need it to keep powering ahead.
Poor data quality and data management are hampering (re)insurers’ drive for efficiencies and growth.
They’re also jeopardising the accuracy of risk assessments and pricing models, potentially resulting in either over- or under-pricing and capital inefficiencies.
The former means lost opportunity, while the latter depletes profit and even capital. Inaccurate data can also mean missed fraud indicators, an issue that is important to resolve in a softening market.
The main challenge
The main challenge isn’t a lack of data specialists, although that’s certainly an issue. The biggest problem is that the data isn’t clean, complete, structured, authenticated, or held in accessible storage. That makes it difficult to integrate into processes easily and automatically.
The industry could get away with this during the hard market. Reinsurers that invested little or nothing in technology have generated stellar returns in recent years, thanks to rising rates and adjustments to attachment points, and insurers have also largely enjoyed a sustained period of profitable growth.
But with property cat reinsurance rates down an estimated 10% to 15% at recent renewals, according to Gallagher Re, and Marsh reporting a 3% decline in global Q1 commercial insurance rates, this hard market complacency has to end.
Carriers that previously benefited from an “offer it and they will come” environment will have to work harder for premiums, and that applies to reinsurers too, given the abundance of capacity in the sector.
Where tech can help
The technological solutions are here. AI and machine learning can automatically ingest, validate, and aggregate data. With the right coding and prompting, AI and ML can also use clean data to give (re)insurers better insight into risk profiles and performance predictability. These tools have the power to open up new risk opportunities and improve insurance penetration and relevance. Better data quality also enhances the customer experience and can improve retention rates by ensuring accurate, timely claims processing and overall service delivery.
Additionally, investing in data cleansing and management tools (whether bought or built) can significantly improve operational efficiency, reducing the resources spent on correcting errors and streamlining processes. These process efficiencies also open up opportunities that were previously too costly to write.
As big corporate insurers increasingly integrate large corporate and specialty business with mid-market commercial portfolios, presenting one face to the market, this becomes even more important. Maintaining high data quality is also essential to avoid regulatory breaches, fines, or reputational damage, and for brokers to mitigate the perma-worry of E&O claims.
Examples
Utilising data through tech to streamline workflows, boost efficiency, and reduce costs is imperative in the current market. Whilst plenty of exciting and innovative examples exist, they will only be worthwhile if they make use of quality data. The old adage “garbage in, garbage out” still rings true.
When we get it right, the industry has the potential to, for example, automate risk assessment using geospatial technology feeding into a centrally accessible data pool. This allows underwriters to write more business than they could previously contemplate. It also ends unpleasant surprises in the form of claims from second- or third-tier properties lurking within a large commercial portfolio that underwriters didn’t have the capacity to risk-assess.
Other examples include “digital twins”, or copies of assets created on a centralised platform, which allow for real-time underwriting adjustments and facilitate accurate rating models with realistic reinstatement values. They form part of the solution to a perennial problem at Lloyd’s and help ensure a fair premium is being paid for the asset insured.
Real-time data culled and made accessible through Internet of Things (IoT) technology can build carriers’ role as genuine risk management partners and potentially allow them to differentiate themselves by sharing the rewards of prudent risk management via premium reductions.
Data from such technology also allows us to close the protection gap and start affirming rather than excluding. A notable example of where we can make better use of data, IoT, and new loss-run models is cyber physical property, a hugely untapped and little-understood area of emerging risk. We now have the capability to build new models and provide affirmative cover for Malicious Cyber Physical Damage, which the market is slowly waking up to.
Within claims, computer-aided fraud detection isn’t new, but data that can flow freely between departments pre-empts situations such as a recent case I learned of in which a claims team was poised to indemnify a policyholder who had not yet paid their premium. Additionally, drone-based site inspections, with the data shared, cut the time and cost of claims processing and provide valuable information for actuarial, risk management, and underwriting colleagues.
While these examples apply primarily to insurers, they also apply to reinsurers.
Firstly, primary insurers are typically the first line of defence and the drivers of innovation. Digitising their portfolio underwriting should deliver better underwriting performance and fewer claims to the treaty, which in turn should mean lower reinsurance capital costs and better pricing. Put another way, that capital saving can be deployed to build new capacity for emerging risk.
Secondly, reinsurers specifically can use AI and LLMs to identify portfolio trends and make predictions more quickly, collaborating with primary markets to arrest under-performing portfolios faster and avoid the need for massive corrections. One such example is the continuing impact of US social inflation.
It’s time to take action
A more proactive approach to tech and data adoption from industry executive leadership would strengthen underwriting margins, build carrier-policyholder relations, and accelerate sustainable market innovation, growth, and future relevance. However, this takes a mindset shift, including the willingness to push boundaries and move the market from exclusion to solution.
A decade after the launch of the Target Operating Model in the London market and almost six years after Lloyd’s Blueprint One, it’s time to move beyond talking about “embracing” technology to embedding it in our day-to-day operations. The rapidly softening market means we have little choice.
Gavin Lillywhite is SVP, Operating Leader, UKI and Europe at Xceedance