By Scott G. Stephenson
Unfortunately, as recent weather events have proven, there may be as much fact as fiction in the old classic The Wizard of Oz — particularly when Dorothy's Kansas home suddenly changes ZIP code. And if the great and powerful Oz had any chance of living up to his reputation, he would have needed a lot more resources than he had access to behind that curtain.
The threat of lions and tigers and bears pales in comparison to Andrew and Katrina and Sandy. The only way to manage hurricanes, tornadoes, and wildfires is through knowledge: Insurers must be able to assess those types of risk, and homeowners must ascertain an appropriate level of coverage to protect themselves.
It's a fundamental principle of insurance to understand exposure to loss and price accordingly. But when the exposures seem to be changing, it's difficult for pricing to keep up.
One example of changing exposure is the hail peril, which has been a major driver of recent catastrophe losses. In fact, according to a Verisk Insurance Solutions – Underwriting analysis of A-PLUS™ data, between 2008 and 2012, hail caused more than 4.5 million claims resulting in $32.1 billion of insured losses — an average of $6.4 billion each year. That's almost three times the average of the preceding period, 2002 through 2007, when annual hail losses averaged $2.3 billion.
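The figures above imply a simple calculation, sketched here as a quick arithmetic check (the dollar amounts are the ones cited in the analysis):

```python
# Check the hail-loss averages cited from the A-PLUS data analysis.
total_losses_2008_2012 = 32.1e9  # insured hail losses, 2008-2012
years = 5
avg_recent = total_losses_2008_2012 / years  # average annual losses, recent period
avg_prior = 2.3e9                # average annual hail losses, earlier period

print(f"Average annual losses, 2008-2012: ${avg_recent / 1e9:.1f} billion")
print(f"Ratio to prior period: {avg_recent / avg_prior:.1f}x")
```

The $32.1 billion total over five years works out to roughly $6.4 billion per year, about 2.8 times the earlier $2.3 billion average, consistent with "almost three times" as stated.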
In the past, many insurers viewed losses from severe thunderstorms as a consistent and manageable portion of their annual losses. In those days, the increasing costs of such storms seemed to be driven simply by the cost of repair materials and labor. However, as the frequency of major storms increases, the costs multiply rapidly. If losses from the peril continue at the current rate, severe thunderstorm losses may become unsustainable for some insurers.
Fortunately, there are actions insurers can take to stem homeowners losses while securing policyholder property. Those actions involve a comprehensive approach to homeowners risk management based on understanding the current exposures more completely and applying prices more precisely.
Over the past ten years or so, the homeowners market has grown increasingly competitive in some regions, while other areas seem to have fewer competitors. In today's market, thinking a house is a house is a house would be disastrous to an insurer's bottom line. To compete, insurers have been trying to get a better understanding of how closely their premiums match their risks. And they're looking for better ways to predict future losses so they can identify their best risks and price all the risks in their book of business more accurately.
The use of advanced analytics is leading the way in increasing predictability and developing new underwriting and rating approaches for homeowners. In the ongoing search for better ways to develop equitable rates that accurately reflect the experience of a group of insureds, more and more companies are looking to create their own predictive models, work with consultants to do so, or purchase models from vendors.
One way companies are using advanced analytics in homeowners is to develop rating plans that rate separately for the individual perils covered by the homeowners policy, as opposed to the traditional approach of rating all perils as a package.
Why are by-peril rating factors more accurate? The relative importance of perils can vary for a number of reasons. For example, fire is a more important issue in areas that lack adequate public protection. Hail can be a significant factor in Nebraska and the rest of Hail Alley but less so in other places. Peril contributions are also changing over time: Homeowners losses used to be predominantly driven by the fire peril, but now, in many locations, water and wind losses exceed fire losses. By-peril rating factors allow for more explicit recognition of the effects of perils varying by location, and they react more dynamically to changing peril contributions over time.
By-peril rating also allows insurers to demonstrate the exposure to risk more accurately. Different home characteristics relate to peril-specific losses in a variety of ways. By looking at advanced data sets from multiple sources, carriers can develop models that consider various characteristics and determine how they affect the cost of homeowners coverage.
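The contrast between the two rating approaches can be sketched in a few lines. The base rates and location relativities below are illustrative assumptions, not actual filed rates:

```python
# A minimal sketch contrasting traditional package rating with by-peril
# rating. All rates and relativities are hypothetical illustrations.

BASE_RATES = {"fire": 120.0, "wind": 90.0, "hail": 60.0, "water": 110.0}

# Hypothetical location relativities: hail dominates in Hail Alley,
# wind dominates on the coast.
LOCATION_FACTORS = {
    "hail_alley": {"fire": 1.0, "wind": 1.1, "hail": 2.5, "water": 0.9},
    "coastal":    {"fire": 1.0, "wind": 2.0, "hail": 0.6, "water": 1.3},
}

def package_premium() -> float:
    """Traditional approach: one rate covering all perils, everywhere."""
    return sum(BASE_RATES.values())

def by_peril_premium(location: str) -> float:
    """Rate each peril separately, adjusted by its own location factor."""
    factors = LOCATION_FACTORS[location]
    return sum(BASE_RATES[p] * factors[p] for p in BASE_RATES)
```

Under package rating, both locations pay the same premium; under by-peril rating, each location's premium reflects its own peril mix, which is the "more explicit recognition" described above.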
In addition, using data and science, insurers can realize better risk assessment when they target costly physical inspections. The challenge is to find and isolate those risks that require an inspection for proper pricing, underwriting, and identification of risk factors that the insured can address. Traditional guidelines that drive inspections, such as the age or value of the home, are not enough to optimize insurers' inspection spend. So, how can carriers achieve more precision with their inspection programs?
The first step is to determine what modifications can improve the decision criteria that drive the inspection ordering process. Simple rule changes that examine new internal or external data points can improve the process. For example, detailed weather data can focus inspections on properties in an area that recently experienced a hail event.
With today's predictive analytics and technology, insurers can optimize inspection spend by employing a model that systematically identifies the risks most likely to have hazard conditions or insurance-to-value deficiency issues. A model that feeds inspection results back into the system can "learn" and become more predictive over time as more data is evaluated.
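One way such a model can "learn" from its own inspection results is online logistic regression, updating its weights each time an inspection outcome is fed back. The feature names and learning rate below are illustrative assumptions, not a description of any particular vendor model:

```python
# A minimal sketch of an inspection-targeting score that learns from
# inspection outcomes via online logistic regression. Feature names,
# starting weights, and the learning rate are illustrative assumptions.
import math

weights = {"age_years": 0.0, "recent_hail_event": 0.0, "years_since_inspection": 0.0}
bias = 0.0
LEARNING_RATE = 0.1

def score(features: dict) -> float:
    """Estimated probability the property has a hazard or
    insurance-to-value deficiency issue."""
    z = bias + sum(weights[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def learn(features: dict, found_issue: bool) -> None:
    """Feed an inspection result back so the model improves over time."""
    global bias
    error = (1.0 if found_issue else 0.0) - score(features)
    bias += LEARNING_RATE * error
    for k, v in features.items():
        weights[k] += LEARNING_RATE * error * v
```

As inspections on similar properties keep turning up issues, the score for that risk profile rises, so future inspection dollars flow toward the properties most likely to need them.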
Another important component to help guide an insurer's proprietary inspection process is the use of aggregated industry data. Knowing how your company stacks up against industry benchmarks provides valuable insights into risk-specific decisions. This solution takes more time and effort to develop and implement than modifying current inspection rules but can deliver a much greater return on investment.
Recent advances in homeowners underwriting have focused extensively on automation. In many cases, simply entering an address into an insurance underwriting application will provide a wealth of information about a property under consideration.
Location-based information is also available in real time. Once an address is geocoded, insurers can automatically incorporate information about fire suppression capabilities, catastrophe risk details, and other critical elements into the underwriting decision. In addition, insurers' use of remote imagery — both aerial and satellite — has gained significant traction over the past several years because staff can confirm a host of essential property characteristics from their desks.
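The enrichment step described above can be sketched as a simple lookup keyed on the geocoded address. The hazard table, addresses, and coordinates are illustrative assumptions; real systems call geocoding and geospatial hazard services rather than in-memory dictionaries:

```python
# A minimal sketch of enriching an underwriting record from a geocode.
# All addresses, coordinates, and hazard values are hypothetical.

# Hypothetical hazard data keyed by a coarse geocode cell.
HAZARD_BY_CELL = {
    (39.74, -104.99): {"fire_protection_class": 3, "hail_risk": "high"},
    (25.76, -80.19):  {"fire_protection_class": 2, "hail_risk": "low"},
}

def geocode(address: str) -> tuple:
    """Stand-in for a real geocoding service (assumed, not a real API)."""
    lookup = {
        "123 Main St, Denver, CO": (39.74, -104.99),
        "456 Ocean Dr, Miami, FL": (25.76, -80.19),
    }
    return lookup[address]

def enrich(address: str) -> dict:
    """Attach location-based hazard details to an underwriting record."""
    cell = geocode(address)
    record = {"address": address}
    record.update(HAZARD_BY_CELL[cell])
    return record
```

Entering only the address, the underwriter's application comes back with fire suppression and catastrophe-risk details already attached, which is the automation the paragraph describes.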
Using aerial tools, insurers are reaping benefits across several areas of the business. Those benefits stem not only from the technology's ability to record exterior structural characteristics (key to assessing potential property losses) but, even more so, from its ability to deliver large quantities of accurate data for further analysis. Such capability drives bottom-line returns: greater efficiency, productivity, and customer satisfaction in the claims cycle, and fewer illegitimate claims paid, because imagery documents how a structure looked before and after a damaging event.
In the future, ultra-high-resolution aerial images using advanced remote-sensing technology may well change the art and science of underwriting. Such images could provide validated property information for decision making while eliminating the cost of an on-site inspection. The range of information available through these tools is remarkable — from structure size and shape to interior wall height, exterior structures, and liability hazards.
A bit farther on the horizon but in view are sophisticated analyses based on satellite imagery, with all the benefits that technology provides for property valuation, risk selection, target marketing, and claims settlement. Imagine the potential for trained image analysts or laymen to record any location on earth and keep tabs on it over time. Still other types of satellites are sensitive to vegetation cover, water, or surface temperature and can detect anything from swimming pools to storm and flood damage.
Before a tropical system strikes, officials usually have several days to prepare. But although straight-line winds, heavy hail, and thunderstorms can be just as destructive as some hurricanes, forecasters have far less lead time to pinpoint the location and severity of wind, hail, and lightning. To settle claims efficiently, claims professionals need accurate, granular weather data as soon as possible.
The greatest opportunity to gain or lose customer loyalty occurs during a claim. That fact underscores the importance of timely, accurate data. Policyholders spend little time interacting with their insurer until they need to — and that's when customers form their perceptions of value. In recent years, insurers have focused significant effort and expense on retooling their claims operations to become more responsive, accurate, and efficient — investing in new technologies such as claims systems, interactive call centers, tracking and scheduling systems, and mobile applications. Those efforts to modernize have paid off, as evidenced by steadily increasing customer satisfaction levels during the claims process.
But customer loyalty isn't the only factor driving the need to accelerate the claims process. Loss ratios have soared in the continued soft market, and insurers typically spend 3 to 5 percent of every premium dollar on adjustment expense. The sooner an insurer can deploy a response team, the less claims leakage will occur. There's no substitute for "feet on the street" in accurately assessing damage through photographs and measurements, but the more claims work that can be done from the desk — such as estimating the number, location, and severity of claims — the quicker the process can be brought to a close. Expediting the claims process decreases costs and increases customer loyalty simultaneously.
With the amount of aggregated historical claim data growing larger each year, the future of weather analytics is clearly migrating toward underwriting. Now that weather analytics can be assessed at the census block or even rooftop level (as opposed to the county level), managing individual property risk profiles based on loss history is quickly becoming a reality. Weather patterns are changing; hail, wind, tornadoes, and other perils are occurring more frequently and in unexpected places. In the insurance marketplace of the future, understanding those trends will be of utmost importance.
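Assessing a peril at the rooftop rather than the county level comes down to testing whether a specific property's coordinates fall inside an event footprint, such as a hail swath. A standard ray-casting point-in-polygon test illustrates the idea; the swath polygon and property coordinates are hypothetical:

```python
# A minimal sketch of rooftop-level peril attribution: testing whether
# a property's coordinates fall inside a hail swath polygon. The swath
# and property coordinates are illustrative assumptions.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: a point is inside if a horizontal ray from it
    crosses the polygon's edges an odd number of times."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's latitude
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical hail swath as (longitude, latitude) vertices.
swath = [(-104.9, 39.6), (-104.6, 39.6), (-104.6, 39.9), (-104.9, 39.9)]
print(point_in_polygon(-104.75, 39.75, swath))  # property inside the swath
print(point_in_polygon(-105.20, 39.75, swath))  # property outside the swath
```

With event footprints like this, a loss history can be attributed to individual properties instead of being smeared across an entire county.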
The value of such knowledge only grows when considering the changing demographics and social and economic structure. According to a recent Conning study, 2013 Personal Lines Consumer Markets Annual, such swings will significantly influence insurance products and distribution. "The demographic changes underway are compounded by the socioeconomic swings, driven in part by the aftermath of the recession," said Steven Webersen, director of research at Conning. "Now more than ever, insurers must stay abreast of consumer trends, activities, and growth opportunities that can guide the development of products and services. To be successful, insurers must understand both the overarching consumer market issues, but also the underlying trends in individual market segments." Among the key groups Conning mentions are high-net-worth individuals, seniors, the youth market, and Hispanics.
With catastrophe events on the rise, changes in weather and climate, and a shifting customer profile, insurers and policyholders need to discover the many ways to overcome the bruising features of a disaster — for both the company and the customer. It's important to recognize the risks we all live with. It's even more important to identify the measures we can take beforehand to reduce the negative impact a disaster can impose.
As Dorothy herself said, "There's no place like home."
Scott G. Stephenson is President and Chief Executive Officer of Verisk Analytics.