Visualization: Experiencing the Insurance Data Aha Moment

By Nana S. Banerjee

In one of the most memorable scenes of the 2001 Oscar-winning movie A Beautiful Mind, John Nash (played by Russell Crowe), the Nobel Prize–winning mathematician, is shown cracking encrypted enemy telecommunications, looking for patterns in magazines and newspapers to uncover a secret Soviet plot. Of all the mathematical references throughout the movie, that scene stands out as chilling and powerful — possibly because of the ease with which Nash connects the patterns between texts and numbers hidden all around him. Notwithstanding the context of the scene itself, set at the onset of Nash’s lifelong fight with schizophrenia, few examples better illustrate the power and dynamism of data visualization.

Danish physicist Tor Nørretranders, in his book The User Illusion: Cutting Consciousness Down to Size, converts the “bandwidth of human senses” to computer terms and explains just why data visualization is the most powerful form of data interpretation.

[Figure: the bandwidth of the human senses, expressed in computer terms. Source: The User Illusion: Cutting Consciousness Down to Size by Tor Nørretranders (Penguin Press Science)]

Using an excellent data visual (see the figure above), Nørretranders demonstrates that when assessing the “language of the mind,” our sense of sight operates an order of magnitude faster than the sense of touch (whose bandwidth is similar to that of a network of computers), which in turn operates an order of magnitude faster than the sense of smell, and so on. The bandwidth associated with the sense of taste is comparable to that of a calculator, and the small white box at the bottom-right corner shows that we are consciously aware of only 0.7 percent of all this processing.
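To make those gaps concrete, here is a minimal sketch of how that comparison could be redrawn as a chart. The bandwidth figures are approximations commonly attributed to the book, rounded to orders of magnitude, and the chart is an illustration rather than a reproduction of Nørretranders’ figure.

```python
import matplotlib.pyplot as plt

# Approximate per-second bandwidths attributed to Norretranders' comparison
# (illustrative, order-of-magnitude values only).
senses = {
    "Sight": 10_000_000,
    "Touch": 1_000_000,
    "Hearing": 100_000,
    "Smell": 100_000,
    "Taste": 1_000,
    "Conscious attention": 40,
}

fig, ax = plt.subplots(figsize=(7, 4))
ax.barh(list(senses.keys()), list(senses.values()))
ax.set_xscale("log")  # order-of-magnitude gaps only read clearly on a log axis
ax.set_xlabel("Approximate bandwidth (bits per second)")
ax.set_title("Bandwidth of the senses vs. conscious processing")
plt.tight_layout()
plt.show()
```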

For any company that deals with a titanic amount of data, data visualization is, and will remain, a fundamental tool for practicing its trade. While insurers have long been skilled at collecting information, they now generate and acquire exponentially growing, disparate, and complex quantities of data — and depend on that data, perhaps for their very survival, in today’s marketplace.

Much of today’s talk about data management and analysis, and its effect on how business gets done, targets the science of analytical modeling — and in that pursuit, as far as insurers have come, they still have farther to go. The industry spends considerable resources and energy collecting and storing data, not just because it produces a lot of it but more likely because the regulatory environment mandates storing much of it. The industry hasn’t spent nearly enough effort aggregating its data across functional silos, integrating internal data with third-party data, analyzing the data, and distributing the resulting insights to people who can act on them. For example, imagine a claim for hail damage to an auto on a given day in Tulsa. The insurer would invariably want to know: Would a claims adjuster at my company know the garaging address for the policy listed on the underwriting system? Let’s say maybe. Would he or she know what the weather was in Tulsa on that day? That’s another maybe. Would all these pieces of data come together at the same time and be presented to the adjuster in a way that’s easy to analyze? That’s highly unlikely. And that’s exactly where visualization becomes so helpful.
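As a minimal sketch of what bringing those pieces together might look like in practice, consider the toy join below. The tables, column names, and values are all hypothetical, standing in for extracts from a claims system, an underwriting system, and a third-party weather feed.

```python
import pandas as pd

# Hypothetical extracts from three siloed systems (all values invented).
claims = pd.DataFrame([
    {"claim_id": "C-1001", "policy_id": "P-77",
     "loss_date": "2012-05-30", "peril": "hail"},
])
policies = pd.DataFrame([
    {"policy_id": "P-77", "garaging_city": "Tulsa", "garaging_zip": "74103"},
])
weather = pd.DataFrame([
    {"city": "Tulsa", "date": "2012-05-30",
     "hail_reported": True, "max_hail_in": 1.75},
])

# Join the claim to the policy's garaging address, then to that day's
# weather, so the adjuster sees all three facts in a single row.
context = (claims
           .merge(policies, on="policy_id")
           .merge(weather, left_on=["garaging_city", "loss_date"],
                  right_on=["city", "date"]))
print(context[["claim_id", "garaging_city", "loss_date",
               "hail_reported", "max_hail_in"]])
```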

No wonder, then, that at a company such as Verisk Analytics, which deals with exactly that kind of data and focuses on making it useful, visualization sits at the heart of our trade. It is thus not surprising that the scientists, analysts, and actuaries across Verisk have consciously embraced the idea of becoming lifelong learners and are committed to continuously improving how we uncover patterns and connections hidden in the data and then display what we aim to convey of our analysis. Technology and software help us do that more effectively and efficiently.

As consumers of information, we are all demanding visualization. It allows us to map information in ways that make decision making easier, faster, and better. The 2012 InformationWeek Business Intelligence, Analytics and Information Management Survey, conducted in late 2011, indicated that nearly half (45 percent) of the 414 respondents cited “ease-of-use challenges with complex software/less-technically savvy employees” as the second-biggest barrier to adopting BI/analytics products, fractionally behind the biggest barrier, “data quality problems,” cited by 46 percent of respondents.

Mother Nature’s Infographics

Catastrophe risk management has come a long way in its 25-year history, and the sophistication of its analytics goes well beyond the numbers in a database. The end result is fast, intuitive insight into what drives risk.

Look back to Superstorm Sandy: in advance of the storm, healthcare officials in New York City were trying to decide whether to evacuate hospitals. In the end, many chose not to move patients before the storm. Unfortunately, numerous hospitals were then catastrophically flooded, and patients had to be moved during the worst of the barrage.

Certainly there are myriad attributes that go into assessing a situation like that, but as analytics and their visualization become increasingly sophisticated, they will be able to help risk-bearing organizations, including insurers and local authorities, develop appropriate prescriptions for mitigating risk — by providing the contextual detail for better-informed decisions.

Today’s advanced climate models can effectively project the impact of storms as they approach coastlines or geographic regions. Such models can assess the total number of policyholders expected to be affected, when an event is expected to worsen, or when it will be safe for insurance personnel to move into the area. Visualization models let the decision maker or assessor drill down to the individual building level, often facilitating a preplanning process and allowing companies to communicate proactively with policyholders so they can take loss control measures — such as boarding windows, reducing the chance of fire, and so on — to mitigate damage. Such models also allow insurers to readily project and visualize the impact of fallen trees on the power lines serving a group of policyholders.
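A minimal sketch of that building-level drill-down might look like the following. The storm track, the crudely buffered impact corridor, and the policyholder locations are all invented for illustration; a real catastrophe model would use far richer event footprints.

```python
from shapely.geometry import LineString, Point

# Hypothetical projected storm track as (lon, lat) pairs, and a crude
# impact corridor built by buffering the track (illustrative only).
track = LineString([(-74.5, 38.5), (-74.2, 39.4), (-74.0, 40.3)])
impact_zone = track.buffer(0.5)  # roughly half a degree on each side

# Hypothetical insured locations, keyed by policy number.
policyholders = {
    "P-101": Point(-74.1, 40.1),
    "P-102": Point(-72.9, 41.0),
    "P-103": Point(-74.3, 39.0),
}

# Flag policies whose insured location falls inside the projected corridor —
# the set a company might contact proactively before landfall.
affected = [pid for pid, loc in policyholders.items()
            if impact_zone.contains(loc)]
print(f"{len(affected)} policyholders in projected impact zone: {affected}")
```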

Given the complexity associated with climate change and the inherent difficulty in the assimilation of evolving and interdependent data, our dependence on a sophisticated and constantly improving visualization capability is far too great to be denied.

The Sight in Business Insight

Unquestionably, the tried-and-true bar, line, and pie charts have served us well. But when relationships become more nuanced and the data more unstructured, visual analytics need to become more dynamic, multidimensional, and customized. For property/casualty insurers, visualization can help identify a range of data issues quickly — from a high-level snapshot of where exposure is located to exposure composition and completeness, including breakdowns by occupancy and construction, building age and height, or geocode quality.
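As a small illustration of that kind of composition breakdown, the sketch below tabulates a hypothetical book of business by occupancy and construction; the records and total insured value (TIV) figures are invented.

```python
import pandas as pd

# Hypothetical exposure records; a real book of business carries many
# more fields (building age, height, geocode quality, and so on).
exposures = pd.DataFrame([
    {"occupancy": "Office", "construction": "Masonry", "tiv": 4_000_000},
    {"occupancy": "Office", "construction": "Frame",   "tiv": 1_500_000},
    {"occupancy": "Retail", "construction": "Masonry", "tiv": 2_250_000},
    {"occupancy": "Retail", "construction": "Frame",   "tiv":   750_000},
])

# One pivot yields the occupancy-by-construction breakdown of TIV that a
# heat map or treemap would then display.
composition = exposures.pivot_table(index="occupancy",
                                    columns="construction",
                                    values="tiv", aggfunc="sum",
                                    fill_value=0)
print(composition)
```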

Visual link analysis technology helps discover critical, previously hidden connections within data. Seeing those connections — within proprietary data, in data from external sources, or through a combination of sources — provides the insight and knowledge to make decisions. The technology finds all data elements applicable to a question and draws a picture of the connections among those elements, revealing relationships that were previously invisible. This contextual approach provides a multidimensional understanding of policy performance, consumer behavior, and industry trends.

Data integrity can be a significant problem for large organizations, especially where multiple, complex databases are involved. Mapping techniques often find thousands of errors in a fraction of the normal time. Mapping also finds red flags in claims data. State insurance fraud bureaus frequently use visual link analysis to assist in their fraud investigations. For example, the user can map each insurance claim, and the technology immediately identifies irregular patterns revealing potential fraud, giving claims investigators the insight and knowledge to make decisions that benefit a company’s bottom line.
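One way such link analysis can work is sketched below: claims are connected through the contact details they share, and any cluster tying several claims together is surfaced for review. The claims, phone numbers, and addresses are all invented, and the shared-attribute graph is one common approach, not necessarily the method any particular product uses.

```python
import networkx as nx

# Hypothetical claims that share contact details (identifiers invented).
claims = [
    ("CLM-1", {"phone": "555-0101", "address": "12 Elm St"}),
    ("CLM-2", {"phone": "555-0101", "address": "98 Oak Ave"}),
    ("CLM-3", {"phone": "555-0199", "address": "12 Elm St"}),
    ("CLM-4", {"phone": "555-0177", "address": "7 Pine Rd"}),
]

# Link each claim to its attributes; claims sharing a phone number or
# address become connected through those attribute nodes.
G = nx.Graph()
for claim_id, attrs in claims:
    for value in attrs.values():
        G.add_edge(claim_id, value)

# Components tying several claims together are the "previously invisible"
# clusters an investigator would want drawn on screen.
for component in nx.connected_components(G):
    linked = sorted(n for n in component if n.startswith("CLM-"))
    if len(linked) > 1:
        print("Possible ring:", linked)
```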

Another insurance application where visualization comes in handy is commercial and personal auto. Telematics programs in place today use sensors to capture factors as simple as distance (vehicle miles traveled) and as sophisticated as camera-based recording. Devices transmit and store the resulting data for immediate or deferred analysis, meaningful interpretation, and/or visualization. As for how large that kind of big data and its visualization may loom: the usage-based insurance (UBI) opt-in rate is expected to increase to 20 percent over the next five years, according to one recent industry poll. Other polls consistently show that two-thirds of consumers are open to telematics-based insurance policies, especially if there’s the potential for premium discounts. Keep in mind that for newer consumers of vehicle insurance — Gen Yers and Millennials — the use of that type of consumer technology is almost expected.
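For a flavor of the simplest telematics factor, distance, here is a minimal sketch that turns a handful of GPS pings into vehicle miles traveled. The pings are invented, and a production program would also clean the data and segment it into trips.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3958.8 * asin(sqrt(a))  # 3958.8 = Earth's mean radius, miles

# Hypothetical (lat, lon) pings from one trip, as a device would transmit.
pings = [(36.15, -95.99), (36.16, -95.97), (36.18, -95.95), (36.20, -95.94)]

# Vehicle miles traveled: sum the legs between consecutive pings.
vmt = sum(haversine_miles(*a, *b) for a, b in zip(pings, pings[1:]))
print(f"Trip distance: {vmt:.2f} miles")
```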

Insurers Say Aha!

Henry David Thoreau said, “It’s not what you look at that matters, it’s what you see.” With data visualization, the significance of the quote is quite literal. Visualization is a fully formed discipline that requires multiple skills — among them, a knowledge of statistics, ideas of space, design and typography, and deep subject matter expertise in the insurance sector.

Dr. Nana Banerjee is a group executive of Verisk Analytics (Nasdaq:VRSK). He serves as president of Argus Information and Advisory Services and as chief analytics officer of Verisk Analytics.