As insurers progress in their ability to compete on analytics, business executives are becoming increasingly interested in the quality of their data. No longer a technical matter, data quality has emerged as a vital enterprise discipline with noticeable consequences for the bottom line.
At long last, insurers are moving away from a concept of data quality that applied limited criteria to a narrow set of data within business silos. Now they're embracing a proactive approach to ensuring high-quality data that's intelligible and valuable across the enterprise.
Just as high-quality data is an enterprise asset, poor-quality data is a costly corporate liability. Accordingly, insurers are now standardizing data creation across the enterprise and instituting structures for the responsible stewardship of data at various levels.
In this 45-minute Webcast, our presenters will discuss vital questions shaping the new thinking about data quality, including:
- Who owns the issue of data quality within the organization?
- Where and when is data quality addressed?
- What tools are used to ensure data quality?
- What is the impact of poor data quality?
- What governance is in place to ensure data quality in the future?
- Which data quality metrics are most important?
Mark B. Gorman
CEO and founder of The Gorman Group Insurance Consultancy
With 30 years of industry experience, Gorman focuses on assisting both insurance carriers and vendors in furthering the adoption of emerging technologies in the insurance market.
Executive Editor of Insurance & Technology
O'Donnell has covered technology in the insurance industry since 2000, when he joined the editorial staff of I&T.
Global Product Marketing Manager for Data Integration Products at SAS
Hausman brings diverse experience to this discussion. His career spans biomedical engineering and SAP consulting, along with roles as a solutions architect and a systems engineer.