In January 2016, the European directive known as Solvency II went into effect, creating a significant challenge for European insurers, including those in the United Kingdom. The directive lets insurers use their own internal models to determine the capital they need to meet their solvency capital requirements. In practice, that means making sure they hold enough capital to cover all potential claims.
The new directive made clear that those internal models must be based on data that is accurate, complete, and appropriate, and that the data must be reviewed at least annually. Data quality and maintenance can no longer be sidelined, because they are now core inputs to capital requirements and catastrophe models. It's a necessity the insurance industry on both sides of the Atlantic is learning and adapting to at every level.
Solvency II reinforces the importance of using the most accurate data possible across the business, from high-level financial reporting down to day-to-day operations: underwriting, rating, and claims all depend on such data. Verisk has built an international reputation by supplying data and analytics that help our customers mitigate risk and improve their bottom lines.
At the property level, what does accurate, complete, and appropriate data look like? As with everything related to property underwriting and rating, the location and scale of the risk are central considerations. You need to study each property up close, at a scale appropriate to the issue you're trying to resolve, looking specifically and granularly at the property to understand its risks, exposures, and estimated costs.
In the UK, insurers tend to focus on the key perils that present the greatest risk, such as flood, fire, and weather. However, at a postcode level (within a neighborhood or block), there are often multiple types of structures. For example, a residential property could be a Victorian semi, a modern detached house, or an interwar flat. A commercial property could contain retail operations, manufacturing facilities, restaurants, or office space. Each type of property and structure will respond differently to an event and carry different repair costs. That's why you can't rely on a postcode-level review alone; at the very least, you must view that level of information with caution and most likely take a closer look, as the sketch below illustrates.
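To make that concrete, here is a minimal Python sketch of the problem. Every figure in it is hypothetical, chosen only to show how a single postcode-level average can misstate the exposure of every individual property it covers:

```python
# Minimal sketch with purely hypothetical rebuild costs and damage ratios:
# three structure types that might share a single UK postcode, each
# responding differently to the same flood event.
properties = [
    {"type": "Victorian semi", "rebuild_cost": 320_000, "damage_ratio": 0.25},
    {"type": "Modern detached", "rebuild_cost": 450_000, "damage_ratio": 0.08},
    {"type": "Interwar flat", "rebuild_cost": 180_000, "damage_ratio": 0.15},
]

# Property-level view: each structure has its own expected loss.
for p in properties:
    p["expected_loss"] = p["rebuild_cost"] * p["damage_ratio"]
    print(f'{p["type"]:<17} expected loss: £{p["expected_loss"]:>7,.0f}')

# Postcode-level view: one blended figure for every risk in the area.
average = sum(p["expected_loss"] for p in properties) / len(properties)
print(f'{"Postcode average":<17} expected loss: £{average:>7,.0f}')
```

With these illustrative numbers, the postcode average of roughly £47,700 understates the Victorian semi's £80,000 expected loss by about 40 percent while overstating the modern detached house's £36,000 by roughly a third. The blended figure is wrong for every property it claims to describe.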
Getting the property information right is crucial to predicting losses accurately and having the right capital reserves in place for potential claims. That’s why data quality and maintenance are key requirements not just for Solvency II but for all underwriting and rating professionals.
Some additional key points:
- Data quality should be measured against the scale of the problem being modelled.
- Underwriting and risk occur at the property level; property-level data is a must to ensure accuracy and appropriateness.
- A complete consideration of property-level risk can occur only with a robust address database and independently validated information on each property underwritten (see the sketch after this list).
- Vulnerability and repair costs from shock events vary significantly between properties of different age and type, which reinforces the need for location- and property-specific data.
- To remain realistic, data and models must be maintained and updated regularly, which requires continued investment and effort to ensure they keep reflecting current conditions.
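The sketch below makes the address-validation and maintenance points concrete: a minimal Python example of the kind of completeness, address, and review-age checks that "accurate, complete, and appropriate" implies at the record level. The field names, reference database, and one-year review cycle are hypothetical assumptions, not any specific vendor's rules:

```python
from datetime import date

# Hypothetical schema: field names and thresholds are illustrative only.
REQUIRED_FIELDS = {"address_id", "property_type", "year_built", "rebuild_cost"}

def validate_record(record: dict, address_db: set, max_age_years: int = 1) -> list:
    """Return a list of data-quality issues; an empty list means the record passes."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")       # completeness
    if record.get("address_id") not in address_db:
        issues.append("address not found in reference database")  # accuracy
    last = record.get("last_reviewed")
    if last is None or (date.today() - last).days > 365 * max_age_years:
        issues.append("data older than review cycle")             # maintenance
    return issues

# Example: a record with a two-year-old review date fails the annual check.
address_db = {"UK-AB12-3CD-001"}
record = {"address_id": "UK-AB12-3CD-001", "property_type": "Victorian semi",
          "year_built": 1895, "rebuild_cost": 320_000,
          "last_reviewed": date(2014, 6, 1)}
print(validate_record(record, address_db))
```

Run against the sample record, the check passes completeness and address validation but flags the stale review date, which is exactly the kind of annual-review discipline Solvency II requires.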
This blog post was derived from a series of white papers written by Dr. Jones.
Request a copy of the white paper here.
Information is also available for Verisk’s global insurance underwriting products here and for UK property data and spatial mapping here.