Over the last decade or so, predictive analytics has evolved into a critical component of success in the property/casualty insurance industry. That trend will only accelerate in the years ahead, as analytics becomes a necessity in marketing, operations, cash flow management, risk assessment, pricing, and claims.
At the same time, insurers have learned that accurate and effective analytical insights must start with relevant, quality data. While good data might make up for mediocre analysis, bad data, or a bad data implementation, always leads to poor results. So what constitutes good data?
To ensure data quality, insurers must collect and document quality and usability measures about each data source and data element. But as my colleagues Darlene Pogrebinsky and Gerry Gloskin and I explained previously, that’s just the first step — there’s more to data management.
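The idea of collecting and documenting quality and usability measures for each data source and data element can be made concrete with a small sketch. The field names, sample records, and validity rules below are hypothetical illustrations, not anything prescribed in this article; the point is simply that per-element metrics such as completeness and validity can be computed and recorded systematically.

```python
# Hypothetical sketch: computing simple quality metrics for each data
# element in a batch of policy records. Field names and rules are
# illustrative assumptions, not from the article.
from datetime import date

records = [
    {"policy_id": "P-1001", "premium": 1200.0, "effective": date(2023, 1, 1)},
    {"policy_id": "P-1002", "premium": None,   "effective": date(2023, 2, 15)},
    {"policy_id": "",       "premium": -50.0,  "effective": date(2023, 3, 1)},
]

# Per-field validity rules (assumed for illustration).
rules = {
    "policy_id": lambda v: bool(v),                  # non-empty identifier
    "premium":   lambda v: v is not None and v > 0,  # positive dollar amount
    "effective": lambda v: v is not None,            # date must be present
}

def quality_report(rows, rules):
    """Return completeness and validity rates per data element."""
    report = {}
    for field, is_valid in rules.items():
        values = [r.get(field) for r in rows]
        present = [v for v in values if v not in (None, "")]
        report[field] = {
            "completeness": len(present) / len(values),
            "validity": sum(is_valid(v) for v in values) / len(values),
        }
    return report

for field, metrics in quality_report(records, rules).items():
    print(field, metrics)
```

A report like this, produced and archived for every incoming data source, is one way to document the quality and usability measures described above.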
Companies have also learned over the years that, when it comes to data, size matters and bigger is better, according to the “Strategic Guide to Big Data Analytics” from the editors of CIO.com. One insight among many to be found there: “Technology leaders should adopt the attitude that more data is better and embrace overwhelming quantities of it,” says Perry Rotella, CIO at Verisk, whose business involves “looking for patterns and correlations between things that you don’t know upfront.”
So, not only is good data necessary for good analytics, but the bigger and better the data, the better the analytics will be. To find out how to begin an effective analytics strategy, check out my next blog, coming soon.