
New Uses For Data Standards: The Future Ain’t What It Used To Be

We recently posted on the continuing discussion of (as opposed to implementation of) data standards within the independent broker/agent and carrier community. It may be that, at the end of the day, we will realize that the development of data standards failed to meet its original stated purpose in a timely fashion, but turned out to be more valuable than originally suggested.

Without reviewing the entire history, it is worth noting that the concept of a universal, standards-based interface goes back to 1967, when the US national independent insurance agents association worked with selected carriers to define what was then called the ‘universal terminal’ concept, and set up a not-for-profit organization – which ultimately became ACORD – to develop standards and protocols. In 1981, Canadian brokers adopted a similar model and worked with carriers to create CSIO.

Since then, there has been a great deal of standards development, and some implementation. Without entering a debate, it is fair to say that results have come significantly more slowly than anyone anticipated. There are many reasons for this, not the least of which are the nuances within specific trading partnerships that defy standardization. To their great credit, leading brokers and insurers in the US and Canada continue to pursue the original vision of a single-entry, multiple-company, real-time interface (see the recent Canadian Underwriter article “The Reality of Real Time,” penned by two leading brokers: Brenda Rose of Firstbrook Cassie and Anderson, and Sheldon Wasylenko of Rayner Agencies).

However, data standards have proven themselves in other ways. As early as the 1980s, new entrants to the agency systems market began using the ACORD/CSIO data standards as the data model for new agency systems offerings. In the 1990s, carriers began using the standards to provide a common language when converting data from legacy systems to more modern technologies.
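To illustrate the “common language” idea, here is a minimal sketch in Python. The field names and mappings are hypothetical illustrations, not actual ACORD/CSIO element names: the point is only that each system maps once, to the standard, rather than to every other system.

```python
# A minimal sketch of the "common language" pattern: map each legacy
# record onto standard field names, then from the standard onto the
# modern system. All field names here are hypothetical illustrations,
# not actual ACORD/CSIO element names.

LEGACY_TO_STANDARD = {"CUSTNM": "insured_name", "POLNUM": "policy_number"}
STANDARD_TO_MODERN = {"insured_name": "insuredName", "policy_number": "policyId"}

def translate(record, mapping):
    """Rename a record's fields per the mapping, dropping unmapped fields."""
    return {mapping[key]: value for key, value in record.items() if key in mapping}

legacy_record = {"CUSTNM": "Ana Silva", "POLNUM": "HP-778-123"}
standard_record = translate(legacy_record, LEGACY_TO_STANDARD)
modern_record = translate(standard_record, STANDARD_TO_MODERN)
print(modern_record)  # {'insuredName': 'Ana Silva', 'policyId': 'HP-778-123'}
```

The payoff is in the mapping count: with the standard as an intermediate model, N legacy systems and M target systems need only N + M mappings rather than N × M point-to-point conversions.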

What developers found was that the richness of the standards, developed by business and technology volunteers, would accommodate virtually all of their requirements, and the standards came fully documented with indexes and search tools. Most importantly, they were targeted at both business and technology users.

The most recent variation on this theme is being driven by the need to aggregate data from multiple, diverse systems into master records for analytics and customer service. The buzzword for this is Master Data Management (MDM), which is described in a PowerPoint presentation by David Loshin of Knowledge Integrity. As we saw at the 2011 Insurance-Canada Technology Conference, master data management is a powerful tool that is only beginning to be exploited.
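To make the aggregation step concrete, here is a minimal sketch of why a shared standard vocabulary simplifies MDM. The system names, field names, and merge rule are all hypothetical simplifications, not drawn from any actual MDM product or the ACORD/CSIO dictionaries: once every source is expressed in the same model, building a master record reduces to little more than a keyed merge.

```python
# A minimal sketch of an MDM-style merge, assuming the source records
# have already been mapped onto a shared, standards-based vocabulary.
# Systems, fields, and the "first value wins" rule are hypothetical.

from collections import defaultdict

policy_admin_records = [
    {"party_id": "P-1001", "given_name": "Ana", "surname": "Silva",
     "postal_code": "M5V 2T6"},
]
claims_records = [
    {"party_id": "P-1001", "given_name": "Ana", "surname": "Silva",
     "phone": "416-555-0199"},
]

def build_master_records(*sources):
    """Consolidate records that share a party_id into one master record.
    A production MDM hub would apply survivorship rules; this sketch
    simply keeps the first value seen for each attribute."""
    masters = defaultdict(dict)
    for records in sources:
        for record in records:
            master = masters[record["party_id"]]
            for field, value in record.items():
                master.setdefault(field, value)
    return dict(masters)

print(build_master_records(policy_admin_records, claims_records))
# {'P-1001': {'party_id': 'P-1001', 'given_name': 'Ana',
#             'surname': 'Silva', 'postal_code': 'M5V 2T6',
#             'phone': '416-555-0199'}}
```

Without a common vocabulary, the hard part of this exercise is reconciling each system’s field names and formats before any merge can happen; with the standards in place, that reconciliation work is already done.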

A recent article in INN points out that business-driven governance of the project is key to its success, and data standards go a long way toward simplifying the governance approach.

So, while data standardization in the insurance industry still has a ways to go in fulfilling the original mandate of broker-carrier connectivity, there are other uses that will justify some, if not all, of the investment to develop and maintain the standards.