Why resolving data quality issues should be a priority for the C-suite in 2024
For insurance firms, data quality is a critical linchpin that, when poorly managed, hinders overall efficiency, competitiveness and progress, says Jesse Power, Head of Insurance at data automation firm Duco.

While the insurance industry is a vital component of the global financial services landscape, its path towards embracing automated solutions has been marked by surprising inaction, setting it apart from other sectors. The primary culprit behind this slow progress is mismanaged data. However, the tide is turning as market participants recognise the impact technology can have on their business and realise the cost of further delay.

In this article, we delve into the main obstacles that have hindered the use of automation in insurance, the remarkable shifts underway, and how automation is rewriting the rules in one of the oldest sectors of financial services.

Data is the lifeblood

Anyone following the technological transformation of financial services has no doubt heard the expression 'data is the lifeblood of financial services', or some equally familiar cliché. While the phrase might sound overused, it remains relevant to the challenges currently faced by many companies, particularly those in the insurance sector. Yet while data is widely recognised as that essential lifeblood, maintaining its quality is too often overlooked, or at least not prioritised at the right levels within organisations.

Where the market is adapting 

Data quality is essential for the successful integration of automation in insurance because it underpins the reliability, efficiency, compliance and overall effectiveness of automated processes. Insurers know that without a focus on improving data quality and transforming their operating model (including a robust data quality and controls framework), full automation across their operations will remain out of reach.

One notable example of the industry's recent commitment to improving data quality is Lloyd's Blueprint Two programme, which has been in the works for the past few years. The project recognises the urgent need to move away from legacy manual processes towards a faster, more efficient and automated digital process.

A vital component of this transformation at Lloyd's is the Core Data Record (CDR), a central repository designed to collect standardised, quality data and harmonise disparate data sources. This consolidation is not only a big step towards improving data quality but also a potential source of substantial revenue for the industry, making it faster and cheaper to do business at Lloyd's.

Regulatory reporting – driver of change

Every insurance company, irrespective of size or speciality, holds both assets and liabilities, and reporting on these is at the centre of its business. Keeping pace with evolving regulatory requirements is therefore essential. Data quality is intrinsically linked to regulatory reporting, and getting this wrong can have serious financial and reputational implications. We are seeing this more and more in the industry, where new regulatory initiatives such as T+1 settlement and the upcoming EMIR Refit are reducing reporting timelines and placing even greater demands on companies, increasing the focus on improving data quality controls and automation.

Insurers need to factor these upcoming reporting deadlines into their efforts to improve data quality controls, automate operations and meet compliance standards. Another recent industry example is this year's move to IFRS 17, the new global accounting standard. The standard has had a major impact on how insurance companies report their financials, often requiring a transformation of operating models driven by reduced reporting timelines and an increased need for data granularity. That enhanced granularity, again, drives a shift towards more efficient, automated data management and controls processes.

As the insurance industry adapts to these changes, the quality and accuracy of data used in these processes are hot topics of discussion, with a direct bearing on compliance and operational efficiency.

Practical examples: inter-system reconciliation and bordereaux management

Inter-system reconciliation is a critical process in insurance, ensuring that data across various systems aligns accurately. These data quality controls and processes safeguard the precision and consistency of financial records and policy information, reducing the risk of errors, fraud and regulatory non-compliance. For example, checking and reconciling data is an integral part of any claims payment process.

Automating reconciliation processes fosters operational efficiency and lowers costs. High-quality data, the result of a robust reconciliation framework, not only enhances decision-making but also improves investor confidence and the overall customer experience.
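To make the idea concrete, the sketch below shows a minimal, illustrative reconciliation check in Python. It is not Duco's platform or any insurer's actual process, and the file names and columns (claim_id, paid_amount) are hypothetical; it simply flags claims that are missing from one system or whose paid amounts disagree beyond a small tolerance.

```python
# Minimal illustrative sketch of an inter-system reconciliation check.
# Assumes two CSV exports with hypothetical columns: claim_id, paid_amount.
import csv


def load_payments(path: str) -> dict[str, float]:
    """Read a CSV export and return a mapping of claim_id -> paid_amount."""
    with open(path, newline="") as f:
        return {row["claim_id"]: float(row["paid_amount"]) for row in csv.DictReader(f)}


def reconcile(policy_admin: dict[str, float],
              ledger: dict[str, float],
              tolerance: float = 0.01) -> list[tuple]:
    """Flag claims missing from either system or with mismatched amounts."""
    breaks = []
    for claim_id in sorted(policy_admin.keys() | ledger.keys()):
        a, b = policy_admin.get(claim_id), ledger.get(claim_id)
        if a is None or b is None:
            breaks.append((claim_id, "missing in one system", a, b))
        elif abs(a - b) > tolerance:
            breaks.append((claim_id, "amount mismatch", a, b))
    return breaks


if __name__ == "__main__":
    # Hypothetical exports from a policy administration system and a finance ledger.
    breaks = reconcile(load_payments("policy_admin.csv"), load_payments("ledger.csv"))
    for claim_id, reason, a, b in breaks:
        print(f"{claim_id}: {reason} (policy admin={a}, ledger={b})")
```

In practice the same pattern scales up to many systems, fields and match rules, but the principle is unchanged: every break surfaced automatically is an error, a fraud signal or a compliance risk caught before it reaches a regulator or a customer.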

Similarly, high-quality bordereaux data (the detailed premium and claims reports passed between coverholders, brokers and carriers) streamlines operations, minimises errors and supports fraud detection. Moreover, it facilitates valuable insights for optimising underwriting and portfolio performance, ultimately influencing reinsurance negotiations and business sustainability. In an era where data-driven decisions are the norm, bordereaux data quality stands as a cornerstone of insurance operations, safeguarding financial stability, regulatory adherence and customer trust.
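A simple flavour of what such controls look like is sketched below: a rule-based validation pass over an incoming bordereau file. Again this is only an illustration, not any vendor's product; the column names (policy_ref, inception_date, gross_premium, currency) and formats are assumptions.

```python
# Minimal illustrative sketch of bordereaux data quality checks.
# Column names and formats are hypothetical assumptions for this example.
import csv
from datetime import datetime

REQUIRED = ("policy_ref", "inception_date", "gross_premium", "currency")


def validate_row(row: dict) -> list[str]:
    """Return the data quality issues found in a single bordereau row."""
    issues = [f"missing {field}" for field in REQUIRED if not row.get(field)]
    try:
        datetime.strptime(row.get("inception_date") or "", "%Y-%m-%d")
    except ValueError:
        issues.append("inception_date not in YYYY-MM-DD format")
    try:
        if float(row.get("gross_premium") or "") < 0:
            issues.append("negative gross_premium")
    except ValueError:
        issues.append("gross_premium is not numeric")
    if len(row.get("currency") or "") != 3:
        issues.append("currency is not a 3-letter ISO code")
    return issues


with open("bordereau.csv", newline="") as f:
    for line_no, row in enumerate(csv.DictReader(f), start=2):  # line 1 is the header
        for issue in validate_row(row):
            print(f"line {line_no}: {issue}")
```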

Looking ahead

Data quality, reliability and modern data architecture remain among the main roadblocks for companies looking to scale, including the top performers. Data quality is not only a technical concern but also strategically crucial for the insurance industry. It serves as the foundation for the sector's capacity to adapt to new regulatory standards, streamline and automate operational processes and, most importantly, maintain its hard-earned reputation and financial integrity. As insurers navigate the sea of data and embrace modernisation, they must consistently place data quality at the forefront of their strategic agenda. In this data-driven era, it is not just an edge; it is a lifeline to success, influencing decision-making, improving risk assessments and enhancing the end-customer experience.

About the author: Jesse Power is Insurance Director at data automation company Duco. He graduated from University College Dublin with a degree in Actuarial Science and Finance. He began his professional career at Milliman as an Actuarial Consultant before moving to Moody's Analytics as Director and Sales Manager for Insurance Solutions across Europe and Africa. Early last year, Jesse joined Duco and was tasked with further advancing the insurance offering and strategy and collaborating with existing clients, including WTW. He is a Fellow of the Institute of Actuaries (FIA) and a Chartered Enterprise Risk Actuary (CERA).
