Industry | Technosavvy | September 2024 Issue

Understanding Uncertainty

Q&A with Rob Newbold, President, Extreme Event Solutions, Verisk
By Michael Fitzpatrick | Posted on August 28, 2024
Q
This summer Hurricane Beryl became the earliest Category 5 hurricane on record; does this mark a rise in extreme weather events?
A

The value that Extreme Event Solutions (EES) at Verisk provides to the (re)insurance market is quantification of potential loss from weather events. It’s important to note that we’re not forecasting if, and when, these events are going to happen but giving the market tools to be prepared in the unfortunate event that they do. Certainly, we have to acknowledge that climate change is real and [is impacting] events like Hurricane Beryl forming so early in the season; the rapid intensification of Hurricane Otis, which strengthened by about 115 mph in just a 24-hour period [in October 2023]; Hurricane Idalia also rapidly intensifying [in August 2023]; increased precipitation coming from these events. These are all very real phenomena, and they’re having an impact on the levels of cat loss that the insurance market is sustaining.

Verisk publishes a report every year giving our estimate of global modeled catastrophe losses. Over the last 10 years or so, our modeled estimate has gone from on the order of $70 billion annually to over $130 billion annually. There are several factors that drive that. Climate change is one piece of it. The increases in both the number of properties constructed in the United States, globally in fact, and the value of those properties due to inflationary cost environments have a direct impact on overall loss. It’s a combination of factors, but it is very real. You see losses going up seemingly every year.

Q
Where have prior catastrophe models fallen short or lagged in recognizing the risks extreme weather events pose across different perils?
A

I don’t know that I would say that catastrophe models have fallen short. I’m going to defend the science and the way risk is viewed. Our perspective is that technology and computational efficiency allow us to provide more applications to give a better quantification of that risk. Ultimately, we are in the business of helping people understand uncertainty.

We can have good data and tools to understand hurricane formation, and once they form, when and how they are going to make landfall, but it’s very difficult to determine for a given event if a tree in your front yard is going to fall or if it’s going to fall in your neighbor’s yard and what that’s going to ultimately mean for your house. You can have two houses right next to each other; one experiences damage and the other does not.

The increases and enhancements that we’re making in NGM (Verisk’s Next Generation Models) allow us to have a better quantification of that kind of uncertainty, which we call secondary uncertainty. The models now, with NGM, allow us to compute many more possibilities of scenario risk should an event happen, to give us, again, a more robust idea not only of uncertainty but of correlations—what happens across different types of insurance coverage, what happens if a given type of policy is impacted and how that flows down to your reinsurance. We can do that on a much more granular scale, using many more calculations than we could before. That lets our ultimate consumers feel more confident in the views of risk that they are taking on.

Q
Verisk released its Next Generation Models earlier this year. What advances in technology and modeling do those models incorporate?
A

NGM provides the ability to handle larger volumes of data and produce more intense and robust calculations of uncertainty within every model loss calculation we are making. So, for a given property, for a given event, for a given insurance coverage, we’re able to do many more calculations—taking into account many more data elements—that ultimately better quantify that uncertainty and can tell everyone in the insurance value chain not only an average estimated view of risk but also an uncertainty range around it, to give you a better idea of the probability of loss to your portfolio.
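For readers who want a concrete picture of what “an average view plus an uncertainty range” means in practice, the sketch below is a minimal, hypothetical illustration. It is not Verisk’s NGM code; the damage-ratio distribution and every parameter value are assumptions made up for the example. It simply samples many possible damage ratios for one property in one event and reports the mean loss alongside a 90% range.

import random
import statistics

# Illustrative only: a toy Monte Carlo sketch of "secondary uncertainty" --
# the spread of possible losses for a single property in a single event.
# The distribution and parameters below are hypothetical, not Verisk's NGM.

def simulate_property_loss(replacement_value, mean_damage_ratio, spread, trials=10_000):
    """Sample possible losses for one property in one event."""
    losses = []
    for _ in range(trials):
        # Draw a damage ratio around the mean, clamped to the valid range [0, 1].
        ratio = min(max(random.gauss(mean_damage_ratio, spread), 0.0), 1.0)
        losses.append(ratio * replacement_value)
    losses.sort()
    return {
        "mean_loss": statistics.mean(losses),
        "p5": losses[int(0.05 * trials)],   # low end of the uncertainty range
        "p95": losses[int(0.95 * trials)],  # high end of the uncertainty range
    }

if __name__ == "__main__":
    result = simulate_property_loss(replacement_value=400_000,
                                    mean_damage_ratio=0.12, spread=0.08)
    print(f"Mean loss: ${result['mean_loss']:,.0f} "
          f"(90% range: ${result['p5']:,.0f} - ${result['p95']:,.0f})")

In a real catastrophe model this kind of step would run across every property, event, and coverage in a portfolio and feed policy and reinsurance terms, which is where the heavier computation the interview describes comes in.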

Essentially, technology has allowed us to rearchitect the entire loss modeling framework to introduce that flexibility and granularity, and importantly, we are able to do that on a global scale. Every model in our global suite has now been rearchitected to perform these more robust and intense loss calculations, to support more insurance terms and conditions, including conditions that previously could not be modeled at all in the insurance and reinsurance space, and, again, to do so with more robust financial uncertainty determinations. That’s really what’s key to NGM.

Q
What new data sources are being used?
A

The updated views of uncertainty and correlation are driven by detailed claims data, and that’s across different perils, types of risks, types of coverages, and sizes and magnitudes of events. In particular, EES has several hundred thousand claims for U.S. hurricane events going back to 2004, which were used directly in these updates. So, we can put the actual losses that the market is experiencing into the model to get validation that what the model is producing is representative of actual, real-world experience. It’s the truth on the ground, which way the tree actually fell, as I mentioned, that makes sure the model gets it right.

To phrase this another way, with the additional terms and structures that we can now model, our clients are able to see the different impacts of those insurance terms and conditions on their data. As we’re getting information about what’s being written in the field, we can put those calculations into the model and give the market, essentially, the ability to quantify what they couldn’t quantify before. So, it’s basically claims data and real-world terms and conditions that we didn’t have in prior versions of the solution.

Q
How does this help brokers, who use models to help counsel their clients?
A

It gives the brokers more information that they can use to counsel their clients. It’s all about quantification of uncertainty. Brokers are counseling their clients on how much risk they should retain, how much risk they can cede. They’re talking to the reinsurance market about what’s an appropriate price to get that risk ceded off to get the appropriate reinsurance protection. The increased insurance terms and conditions and the increased quantification of uncertainty allow the brokers to go to their clients with more confidence, to give them the ability to purchase the right amount of reinsurance protection knowing they have a view of risk that’s more detailed and more precise than what they had in prior versions.

Q
How do the new models aid risk management and risk mitigation?
A

In the same way, we are providing tools that the industry can use to quantify more accurate technical pricing of the risk. Obviously, there’s always a business transaction that takes place when you’re ceding that risk or issuing an insurance policy, but that all has to start with a technical view of risk. And the more confident you can be in that technical view, the more certainty you have going into that insurance or reinsurance transaction, or, as a risk manager, knowing how much insurance you have to buy.

Importantly, mitigation is key. We talked about increasing losses affecting the global insurance and reinsurance market. The ability to take proactive measures to mitigate that risk is obviously on everyone’s mind. The more we can mitigate up front, the less actual loss the industry is going to experience. As the models drill down to the policy-by-policy level and look at the impact of storm shutters, setback distance, the type of roof a structure has, or an impact-resistant hail roof, they can produce a quantification of loss for different mitigation features. That allows you to make proactive choices to perhaps invest in those features in your home or in your portfolio.
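As a rough illustration of how a model can attach numbers to mitigation features, the sketch below applies hypothetical reduction factors for storm shutters and an impact-resistant roof to a base expected damage ratio. The factor values are invented for the example and are not taken from Verisk’s models.

# Illustrative only: hypothetical mitigation modifiers applied to a base
# damage ratio, to show how a model could compare expected loss with and
# without features such as storm shutters or an impact-resistant roof.
# The factor values below are made up for the example.

MITIGATION_FACTORS = {
    "storm_shutters": 0.85,         # assumed 15% reduction in expected damage
    "impact_resistant_roof": 0.80,  # assumed 20% reduction
    "greater_setback": 0.95,        # assumed 5% reduction
}

def expected_loss(replacement_value, base_damage_ratio, features):
    """Scale a base damage ratio by the modifier for each present feature."""
    ratio = base_damage_ratio
    for feature in features:
        ratio *= MITIGATION_FACTORS.get(feature, 1.0)
    return ratio * replacement_value

baseline = expected_loss(400_000, 0.12, [])
mitigated = expected_loss(400_000, 0.12, ["storm_shutters", "impact_resistant_roof"])
print(f"Baseline expected loss:  ${baseline:,.0f}")
print(f"Mitigated expected loss: ${mitigated:,.0f}")

Comparing the two numbers is the kind of quantification that lets a homeowner or portfolio manager weigh the cost of a mitigation feature against the modeled reduction in loss.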

Verisk has deployed this advanced new risk framework across EES’s global suite of models, and this is important because it allows risk managers, carriers, brokers, and reinsurers to have a single, consistent financial view of risk that they are using to understand everything from a given home all the way up to a full reinsurance portfolio. That is notable and differentiating… and hopefully gives the market better confidence to transact risk. We are in the business of making the world more resilient and believe that more robust quantification of risk and better quantification of uncertainty on a global scale will help the (re)insurance market do that.

Deicing Solar

With its wide-open spaces, abundant sunshine, and warm regulatory climate, Texas has seen solar power blossom, but severe weather is clouding that picture. The risk was highlighted last spring by hail damage to a 350-megawatt solar installation not far from Houston. Facing potentially expensive hail damage and rising insurance costs, the solar industry is making its installations more resilient to extreme weather.

“Every event that occurs we’re learning more about how to mitigate them, which way you should tilt the panels, how the steepness of the tilt matters, how equipment selection matters,” says Jason Kaminsky, CEO of kWh Analytics, a leading insurance provider for renewable energy. The kWh 2024 Solar Risk Assessment addresses risk management challenges in solar, including how standard models can sharply underestimate weather losses.

Hail claims for solar installations average $58.4 million per claim and account for more than half of the costs of solar loss claims, renewable energy insurance specialist GCube Insurance has estimated, citing hail as an “existential” threat to the industry.

In Texas, which is expected to lead the nation in new solar installations over the next five years, hail damage has refocused the industry. Prior to 2019, solar developers had emphasized low cost and efficiency as they rapidly built out the state’s solar infrastructure.

“In 2019, we saw our first really severe hail losses begin to affect the solar industry,” Kaminsky says. In May 2019, a West Texas hailstorm caused about $70 million in insured losses at a 1,500-acre solar project near Midland. That caught insurers’ attention even as the overall property market was hardening, leading to rate and deductible increases.

“On a risk-adjusted basis, prices went up multiples of what they were,” Kaminsky says. That added urgency to moves to make solar more resilient. Protective measures include hardening panels against hail and tilting them correctly during hailstorms. Most solar farms are built to track the sun, but increasing the tilt means hail strikes a glancing blow rather than a direct hit.

Another challenge is reacting quickly enough to a rapidly developing hailstorm to get panels into “hail stow” mode, or maximum possible vertical tilt, in time. Emerging “auto stow” technology seeks to address this risk.
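As a rough sketch of the decision an “auto stow” controller has to make, the toy rule below tilts trackers to their maximum stow angle when hail looks likely and imminent. The function names, thresholds, and angles are hypothetical, not any vendor’s actual API; real systems integrate weather feeds and tracker hardware.

# Illustrative only: a toy "auto stow" decision rule. Names, thresholds,
# and angles are hypothetical assumptions for the example.

MAX_TILT_DEGREES = 60          # assumed maximum stow tilt for this tracker
HAIL_LEAD_TIME_MINUTES = 30    # assumed warning window needed to reach stow

def choose_tilt(tracking_tilt, hail_probability, minutes_to_storm):
    """Return the tilt command: keep tracking the sun, or stow for hail."""
    if hail_probability >= 0.5 and minutes_to_storm <= HAIL_LEAD_TIME_MINUTES:
        return MAX_TILT_DEGREES  # steep tilt turns direct hits into glancing blows
    return tracking_tilt         # otherwise keep following the sun

print(choose_tilt(tracking_tilt=25, hail_probability=0.7, minutes_to_storm=20))  # -> 60
print(choose_tilt(tracking_tilt=25, hail_probability=0.2, minutes_to_storm=20))  # -> 25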

“We’re still learning as an industry,” Kaminsky says. “And while you have to plan for that (hail risk), and you have to insure against it, it doesn’t lead to an existential crisis from a power reliability perspective, in my opinion.”

That’s because solar and other renewable energy assets are built in a distributed fashion so they are essentially independent power plants built next to each other—even single, large facilities. If a portion of a solar farm suffers severe hail damage, the overall plant can continue to produce power.

“So even if you have a damaged facility that leads to a large insurance claim,” Kaminsky says, “at the end of the day, it’s a blip in the grid generation for a state or region.”

Michael Fitzpatrick, Technology Editor
