Modern Underwriting: More Hats and a Bigger Toolbox

By Zack Schmiesing

There’s a notable level of cautious optimism in the insurance industry about new opportunities in emerging risks that hold potential to drive the future marketplace. The industry posted favorable combined ratios from 2013 through the first half of 2015, with positive underwriting results (albeit on tighter margins), record levels of policyholder surplus, lower natural catastrophe losses, and reinsurance rates not seen since the early 2000s (Insurance Information Institute, Dr. Robert Hartwig, http://www.iii.org/sites/default/files/docs/pdf/florida-102715.pdf).

But many variables are keeping executives awake at night. How will quickly developing cyber risk affect the traditional property/casualty market? What will coverage look like as the Internet of Things (IoT) continues to grow? How do we understand the challenges the evolving sharing economy poses for consumer services? And how will the market change based on the decisions of the sometimes unpredictable Millennials?

The clearest path for setting strategy and developing solutions likely lives in the data—or rather, the big data—that insurers can gather, analyze, and use for developing products. Advances in processing power and analytic techniques allow us to take even the most complex and unstructured data sets and turn that information into results that differentiate within the market. In insurance, predictive modeling and data visualization are now expected skills for success. But who at the insurer is ultimately responsible for making decisions to drive new products—and doing so in a profitable way?

Underwriters Hold the Key

In recent times, underwriters have held that responsibility. They review the risk characteristics on policy applications and renewals. They compare coverages and counts against defined exposure thresholds and risk management guidelines. And ultimately, they represent the last line of defense for profitable decision making. But the days of simple address checks, occupant verification calls, and green-screen workstations have disappeared.

Policy fields can be pulled, verified, and auto-filled into virtual policy platforms instantly today. Extensive engineering and protection detail is available at the users’ request, along with loss cost modeling for fire Group I, Group II, and natural catastrophe perils. Distance-to-flood sources, coastlines, and terrorism targets are accessible at point of sale and can be viewed through geographic information system (GIS) platforms at various levels of resolution and detail. Predictive modeling tools and increased emphasis on bundling products are now a major focus of the underwriting process.
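
As a rough illustration of what that prefill step can look like under the hood, here is a minimal sketch in Python. The reference dataset, field names, and matching logic are hypothetical stand-ins, not any particular vendor’s platform or API.

```python
# Minimal sketch of policy-field prefill against a hypothetical internal
# reference dataset keyed by normalized address. Field names and data are
# illustrative placeholders, not a real vendor API.

REFERENCE_DATA = {
    "123 MAIN ST, ANYTOWN, US": {
        "construction_class": "Joisted Masonry",
        "year_built": 1987,
        "square_footage": 4_200,
        "public_protection_class": 4,
    },
}

def normalize(address: str) -> str:
    """Crude normalization so minor formatting differences still match."""
    return " ".join(address.upper().replace(".", "").split())

def prefill_policy_fields(address: str, application: dict) -> dict:
    """Fill any missing application fields from the reference record, if found."""
    record = REFERENCE_DATA.get(normalize(address), {})
    filled = dict(application)
    for field, value in record.items():
        filled.setdefault(field, value)  # keep agent-entered values, fill only gaps
    return filled

if __name__ == "__main__":
    app = {"square_footage": 4_150}  # the agent supplied only one field
    print(prefill_policy_fields("123 Main St., Anytown, US", app))
```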

Have these sophisticated tools and this amplified processing prowess surpassed the capabilities and expectations we have of today’s underwriter? Do automation and prefilled information supplant the need for large underwriting staffs, or is the role now more important than ever to drive profitability? Do we need to rethink how this role is staffed, trained, and implemented?

Underwriting departments have evolved in recent years into a melting pot of educational backgrounds and prior professional experience, with underwriters not only reviewing policy submission data but also assisting in setting strategies for lines of business across geographies. Staff rosters include data analysts, engineers, economists, statisticians, meteorologists, geologists, epidemiologists, transportation specialists, and sociologists. These skill sets have a major effect on how training is conducted and processes are launched, how target markets are assigned, and especially how teams are built to focus on new products or account for data that’s disrupting existing lines (such as telematics in personal and commercial auto).

Education and experience outside and within the insurance industry can and will have an impact on risk perception as well. Some underwriting shops may opt for strict procedures governing the number and types of risks they write, providing clear, rule-based guidance for decision making. This approach removes much of the guesswork and uncertainty from the process. But are those insurers inadvertently limiting opportunities for future business? Other carrier operations provide open access to tools and data resources for their underwriters, train them on the nuances of the business, and give them more responsibility, albeit with more accountability. Access to tools and information can help decision making but can also be daunting when efficiency must be a focal point of operations.
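
To see why the stricter approach removes guesswork, consider that such guidelines often reduce to explicit, checkable rules. The sketch below is hypothetical: the occupancy classes, thresholds, and field names are invented for illustration, not any actual carrier’s appetite.

```python
# Hypothetical rule-based eligibility check. Occupancy classes, thresholds,
# and field names are illustrative, not an actual carrier's guidelines.

ALLOWED_OCCUPANCIES = {"office", "retail", "restaurant"}
MAX_TOTAL_INSURED_VALUE = 5_000_000   # per-location TIV cap, in dollars
MIN_DISTANCE_TO_COAST_MI = 1.0        # wind exposure guideline

def eligibility(risk: dict) -> tuple[bool, list[str]]:
    """Return (eligible, reasons) for a single location submission."""
    reasons = []
    if risk["occupancy"] not in ALLOWED_OCCUPANCIES:
        reasons.append(f"occupancy '{risk['occupancy']}' not in appetite")
    if risk["total_insured_value"] > MAX_TOTAL_INSURED_VALUE:
        reasons.append("TIV exceeds per-location cap")
    if risk["distance_to_coast_mi"] < MIN_DISTANCE_TO_COAST_MI:
        reasons.append("within coastal wind guideline distance")
    return (not reasons, reasons)

if __name__ == "__main__":
    ok, why = eligibility({"occupancy": "restaurant",
                           "total_insured_value": 6_200_000,
                           "distance_to_coast_mi": 0.4})
    print(ok, why)
```

Rules like these make decisions consistent and auditable, which is the appeal of the prescriptive model; the open-toolbox model trades some of that consistency for individual judgment.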

Learning to COPE

The seismic shift in underwriting responsibilities over the last decade has placed more focus on the complexity of individual risks, and we have technology to thank for that. Construction, occupancy, protection, and exposure information—better known to the insurance industry as COPE data—varies drastically from one location to the next, particularly in stand-alone commercial properties. It would not be uncommon to see a gas station, restaurant, and doctor’s office located on the same block in Anytown, U.S.A.
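
COPE data lends itself to a structured record per location. A minimal sketch of what such a record might look like follows; the fields and example values are hypothetical, not a standard industry schema.

```python
# Illustrative COPE record for a single commercial location. Fields and
# example values are hypothetical, not a standard industry schema.
from dataclasses import dataclass, field

@dataclass
class COPERecord:
    # Construction
    construction_class: str        # e.g., "Frame", "Joisted Masonry", "Fire Resistive"
    year_built: int
    # Occupancy
    occupancy: str                 # e.g., "gas station", "restaurant", "medical office"
    # Protection
    public_protection_class: int   # 1 (best) to 10
    sprinklered: bool
    distance_to_hydrant_ft: float
    # Exposure
    adjacent_occupancies: list[str] = field(default_factory=list)
    distance_to_coast_mi: float = 999.0

# Three very different risks can sit on the same block in Anytown, U.S.A.
block = [
    COPERecord("Noncombustible", 1995, "gas station", 4, False, 250.0, ["restaurant"]),
    COPERecord("Joisted Masonry", 1962, "restaurant", 4, True, 180.0, ["gas station", "medical office"]),
    COPERecord("Masonry Noncombustible", 2008, "medical office", 4, True, 300.0, ["restaurant"]),
]
```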

GIS location technology has moved exposure aggregates to more granular levels with greater accuracy. Risk managers now think in terms of feet instead of counties and ZIP codes (which vary in size). Underwriters can quickly determine not only that a business is Joe’s Restaurant but also that Joe’s seats 78 people, offers table service, holds a liquor license, and has had three OSHA violations recorded in the last seven years. We can now account for the fact that Dr. Smith’s office has fire extinguishers, an automatic fire suppression system, ample flow pressure if the system is activated, and four public fire hydrants within 1,000 feet of the property. And we haven’t even discussed building quality, roof age, or any natural hazards present, all of which the underwriter can access in seconds.
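
The hydrant example boils down to a radius query around the property’s coordinates. Here is a minimal sketch using a great-circle (haversine) distance; the coordinates and the 1,000-foot radius are made up for illustration.

```python
# Count fire hydrants within a radius of a property using great-circle
# (haversine) distance. Coordinates and the 1,000-foot radius are illustrative.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_FT = 20_902_231  # mean Earth radius in feet

def haversine_ft(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in feet."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_FT * asin(sqrt(a))

def hydrants_within(property_pt, hydrants, radius_ft=1_000):
    """Return the hydrants whose distance from the property is within radius_ft."""
    plat, plon = property_pt
    return [h for h in hydrants if haversine_ft(plat, plon, h[0], h[1]) <= radius_ft]

if __name__ == "__main__":
    office = (40.7512, -73.9940)   # hypothetical coordinates for Dr. Smith's office
    hydrants = [(40.7519, -73.9937), (40.7498, -73.9968), (40.7531, -73.9889)]
    print(len(hydrants_within(office, hydrants)))  # count within 1,000 feet
```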

As noted above, GIS tools have become one of the biggest disruptors to the industry. Google Earth’s free version, released in 2005, opened the door for enterprisewide utilization, with only a shallow learning curve for the user and visualizations that would reshape the way we think about geography-based underwriting rules and strategies. Ten years later, predictive modeling, machine learning, and data visualization tools are exhibiting similar adoption patterns. A poll of insurance industry actuaries at the recent CAS In Focus Seminar indicated that 75 percent are aware of, or are actively using, predictive modeling and data analysis software as part of their operations. Programs such as “R” for statistical analysis and predictive modeling, “Hadoop” for distributed storage and processing of big data, and “Tableau” for data visualization are showing signs of becoming the insurance industry’s next phenomenon, much as Google Maps did. Investments in data science and analytic capabilities are growing substantially. Will carriers be able to easily and efficiently integrate those tools into their underwriting process?
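
To make “predictive modeling” concrete, the kind of model many shops start with is a simple classifier that scores the likelihood of a claim from COPE-style features. The sketch below uses Python and scikit-learn purely as an analogous illustration of what an analyst might build in R; the features, coefficients, and data are synthetic, and a production model would involve far more rigor and review.

```python
# Minimal predictive-modeling sketch: a logistic regression scoring claim
# propensity from a few COPE-style features. Data are synthetic and the
# features are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Synthetic features: building age (years), sprinklered (0/1), distance to hydrant (ft)
age = rng.integers(1, 100, n)
sprinklered = rng.integers(0, 2, n)
hydrant_ft = rng.uniform(50, 2_000, n)

# Synthetic ground truth: older, unsprinklered, hydrant-distant buildings claim more often
logit = 0.02 * age - 1.0 * sprinklered + 0.001 * hydrant_ft - 2.0
had_claim = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([age, sprinklered, hydrant_ft])
model = LogisticRegression(max_iter=1_000).fit(X, had_claim)

# Score a new submission: 60-year-old, unsprinklered building, hydrant 900 feet away
new_risk = np.array([[60, 0, 900]])
print(f"Estimated claim propensity: {model.predict_proba(new_risk)[0, 1]:.2f}")
```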

As new technological and social behavior patterns threaten to disrupt the property/casualty insurance marketplace, the last line of defense—as well as the key to profit potential—lies within underwriting operations. Teams are staffed with a wide range of experience, talent, and responsibility, but they’re being asked to adapt more quickly than ever to the new tools and information necessary to make educated decisions. Data science skills may elevate the importance of underwriting staffing as we know it today. Will existing staff be able to adapt, merging their underwriting experience and knowledge with those new skills, to realize the full value of both traditional and new underwriting strategies?

Zack Schmiesing is the director of thought leadership for commercial lines underwriting for Verisk Insurance Solutions.