Is Data Forcing Modern Systems to Stay in Second Gear?

There has been considerable emphasis in the past few years on rapid execution of implementation plans for large systems. Agile methodologies and modern technologies allow significantly shorter timelines from the decision stage to lights-on. However, the delivery of strategic goals may not follow as quickly, as we cope with our increased need for more, and better, data and analytics.

As always, we are interested in your experiences.

Data Conversion and Data Integration

Modern systems come with modern data architectures that have high capabilities and high requirements. Much of the data populating existing systems was not subject to as much rigor when those systems were introduced, and it was stretched further over the years as new insurance products forced compromises to data structures. In some cases, the data is so dirty that the insurer resorts to manually keying it into the new systems, a time-consuming, error-prone strategy.
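To make that rigor gap concrete, here is a minimal sketch (in Python, with hypothetical field names and rules) of the kind of validation pass a migration team might run on legacy policy records before loading them into a new system. It illustrates the technique only; it is not any vendor's actual requirements.

    # A minimal sketch of pre-migration validation for legacy policy
    # records. All field names and rules are hypothetical.
    from datetime import datetime

    REQUIRED_FIELDS = ["policy_number", "effective_date", "premium"]

    def validate_record(record):
        """Return a list of problems found in one legacy record."""
        problems = []
        for field in REQUIRED_FIELDS:
            if not record.get(field):
                problems.append("missing " + field)
        # Legacy systems often hold dates as free-form text.
        try:
            datetime.strptime(str(record.get("effective_date", "")), "%Y-%m-%d")
        except ValueError:
            problems.append("unparseable effective_date")
        # Premiums keyed as strings with thousands separators are common.
        try:
            if float(str(record.get("premium", "")).replace(",", "")) < 0:
                problems.append("negative premium")
        except ValueError:
            problems.append("non-numeric premium")
        return problems

    # Records that fail are routed to manual review rather than
    # rekeyed wholesale into the new system.
    legacy = [{"policy_number": "P-1001", "effective_date": "03/15/98",
               "premium": "1,250.00"}]
    for rec in legacy:
        issues = validate_record(rec)
        if issues:
            print(rec["policy_number"], "->", issues)

Records that fail checks like these are the ones that end up in manual-review queues, which is where the rekeying effort comes from.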

Although software suites are making headway in the market, many insurers elect a ‘best of breed’ strategy, which entails buying the policy admin system from one vendor, the claims system from another, business intelligence from a third, etc. This brings strong functionality in each domain, but it multiplies the integration points between systems.

While there are some standards available (ACORD, CSIO, etc.), most vendors elect to develop their own data models. Ram Sundaram, principal of X by 2, an insurance consulting firm, told PropertyCasualty360.com, “When you look at business processes and change management, integration is still a significant problem. It has not been solved.”
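In practice, the absence of a shared data model means each pair of systems needs its own translation layer. The sketch below (Python, with invented field names standing in for two vendors’ proprietary schemas) shows the shape of such a layer, and why unmapped fields are where the integration effort hides.

    # A minimal sketch of the field-mapping layer a best-of-breed shop
    # ends up writing between vendors' data models. All field names
    # are invented for illustration.
    CLAIMS_TO_ADMIN = {
        "clm_no": "claim_number",
        "pol_ref": "policy_number",
        "loss_dt": "date_of_loss",
        "rsv_amt": "reserve_amount",
    }

    def translate(record, mapping):
        """Rename fields per the mapping; collect anything unmapped."""
        translated, unmapped = {}, []
        for source_field, value in record.items():
            target = mapping.get(source_field)
            if target:
                translated[target] = value
            else:
                unmapped.append(source_field)
        return translated, unmapped

    claim = {"clm_no": "C-2044", "pol_ref": "P-1001",
             "loss_dt": "2013-06-02", "rsv_amt": 5000, "adj_cd": "17"}
    record, leftovers = translate(claim, CLAIMS_TO_ADMIN)
    print(record)     # the canonical form the admin system expects
    print(leftovers)  # ['adj_cd'] -- a field with no agreed meaning

A shared standard such as ACORD would fix the target names in advance; without one, every unmapped field (like adj_cd above) becomes a negotiation between two vendors’ analysts, and the mappings multiply with each new system added.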

Data Utilization

Beyond the issues of managing existing data, we are in the midst of a data expansion which promises to continue in exponential fashion and to keep data proficiency on the cutting edge of competitive advantage. These trends will strain existing resources and increase the risk of delay as IT and the business struggle to put data to effective use.

For a recent article in Insurance & Technology, Novarica’s Matt Josefowicz noted that the traditional underwriting process “was designed for a world of information scarcity and is trying to adapt now to information super-abundance.” This could cause radical changes in how underwriting is done, Josefowicz argues, as new models are developed for pricing risk based on big data.
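As a toy contrast between the two worlds Josefowicz describes, consider a traditional rate table keyed on a few factors next to a model-driven score that can absorb many more signals. Every factor, weight, and number below is invented for illustration; real pricing models are fitted to claims experience and subject to regulatory filing.

    # Toy contrast between scarcity-era and abundance-era pricing.
    # All factors, weights, and numbers are invented for illustration.

    # Traditional: a small rate table over a handful of factors.
    RATE_TABLE = {
        ("urban", "young"): 1.40, ("urban", "mature"): 1.10,
        ("rural", "young"): 1.20, ("rural", "mature"): 1.00,
    }

    def table_rate(territory, age_band, base=1000.0):
        return base * RATE_TABLE[(territory, age_band)]

    # Data-rich: a linear score over arbitrarily many signals; in a
    # real model the weights would be fitted to claims experience.
    WEIGHTS = {
        "hard_braking_per_100km": 0.03,
        "night_driving_share": 0.20,
        "years_claims_free": -0.02,
        "annual_km_thousands": 0.01,
    }

    def model_rate(signals, base=1000.0):
        factor = 1.0 + sum(WEIGHTS.get(name, 0.0) * value
                           for name, value in signals.items())
        return base * factor

    print(table_rate("urban", "young"))             # 1400.0
    print(model_rate({"hard_braking_per_100km": 2.0,
                      "night_driving_share": 0.3,
                      "years_claims_free": 5,
                      "annual_km_thousands": 12}))  # 1140.0

The point is structural: the rate table can only grow by adding cells, while the scoring approach absorbs new data sources by adding weights, which is what makes information super-abundance usable in pricing.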

What Are You Seeing?

If you’re in the midst of a systems modernization project, we’d be interested in your thoughts. Is the value of the new systems emerging as quickly as the systems themselves are being introduced, or is there a lag as we get used to bigger, more complex data configurations? Are there strategies for a more ‘agile’ adaptation to the new data world?

Give us the facts and your best analysis.