Ernst Rauch explains the complexities involved in windstorm modeling of re/insurance portfolios

In spite of the significant increase in losses from major natural catastrophes(1) in recent decades, earthquakes, windstorms, or floods are, in the statistical sense, 'rare events' for the insurance industry, particularly when it comes to assessing the risk for regionally limited portfolios.

This limits the traditional underwriting methods used to answer the two questions central to the assumption of catastrophe covers:

1) What is the risk premium rate, i.e. the risk-adequate price for the assumed liability?

2) How extreme is the accumulation loss potential or probable maximum loss (PML)?

For burning cost calculations, which determine the price on the basis of past loss experience, the record of historical losses is, as a rule, not sufficiently comprehensive. In most cases it contains no very large losses at all, partly because the return periods of significant natural catastrophes often run to several hundred or even a thousand years and more, and partly because the coverage conditions applied in the insurance industry have been subject to constant change. Similar problems arise with mathematical-statistical loss models, which use historical data as a basis for forecasting future losses from statistical loss distributions. These methods, too, carry major uncertainties, because the underlying information (past losses) is incomplete and because the plausibility of the results is not checked against scientific (seismological, meteorological, hydrological) boundary conditions.
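A burning cost calculation can be reduced to very few lines, which also makes its limitation obvious. The figures below are illustrative, not from the article:

```python
# Minimal burning-cost sketch: the risk premium rate is the average of
# historical annual losses divided by the exposure base. A record this
# short cannot contain events with return periods of hundreds of years,
# which is exactly the problem described in the text.

annual_losses = [1.2, 0.0, 3.5, 0.4, 0.0, 7.9, 0.2]  # observed annual losses, $m
sum_insured = 900.0  # average insured value over the period, $m

burning_cost_rate = sum(annual_losses) / len(annual_losses) / sum_insured
print(f"burning cost rate: {burning_cost_rate:.4%} of sum insured")
```

A seven-year record like this one, even if complete, simply averages the observed past and says nothing about the tail of the distribution.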

Furthermore, given a changing risk situation (e.g. as a result of climate change), purely statistical modeling methods potentially lag behind the actual loss development.

Risk models

Today's most common way of achieving a (partial) solution to the problems involved in assessing natural catastrophe risk is to fill in the initially missing (loss) information using simulation models based on physical data.

The origins of this modeling method go back to the 1960s, but it was not until 20 years later, at the end of the 1980s, that the development of commercial risk models for use in the insurance industry began to make headway as a result of enhanced and more affordable computer performance.

These techniques owed their final breakthrough to nature itself. Record levels of insured losses from catastrophic events such as Hurricane Hugo in 1989 (insured market loss: $4.5bn), the 1990 series of windstorms in Europe (insured market loss: $10bn), Hurricane Andrew in 1992 (insured market loss: $17bn), and the Northridge earthquake of 1994 (insured market loss: $15bn) made clear that the risk assessment methods applied until then did not always produce results in line with reality.

A common feature of all natural catastrophe models is the definition of the concept of risk incorporating the following three parameters: hazard, vulnerability, and (insured) values.

Hazard

This component denotes - in simple terms - the proportion of the risk accounted for by nature (geophysical and meteorological processes). In deterministic models, this usually means an historical or hypothetical natural catastrophe (earthquake, windstorm, flood) which is described in detail by way of its intensity and geographical dimensions. The concept of hazard is more common in connection with probabilistic methods of analysis.

Hazard is defined as the expected maximum intensity from a natural hazard event (usually the peak wind speed in windstorm modeling) within a specified period of time.

Deterministic factors

The aim of the deterministic modeling method is to estimate for selected historical and/or possible future events ('scenarios') the resulting 'as if' losses for an individual risk or a geographically distributed portfolio.

The term 'as if' means that the resulting losses are applied to present conditions, taking into consideration inflation, demographic trends, market penetration, and other components that determine the insured loss. As the result is no more than a loss amount, possibly with an uncertainty margin attached to the estimate, without any indication of probability (return period), this method is essentially only suitable for assessing very large loss potentials (worst cases) or for restating the loss amount from historical catastrophes at current value relationships.
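The trending step behind an 'as if' figure can be sketched in a few lines. The trend factors below (claims inflation, growth of insured values) are hypothetical placeholders, not figures from the article:

```python
# 'As if' adjustment sketch: a historical insured loss restated under
# present-day conditions by compounding assumed annual trend factors.

historical_loss = 4.5e9   # insured loss at the time of the event, $
years_elapsed = 15
claims_inflation = 1.025  # assumed annual claims inflation
exposure_growth = 1.03    # assumed annual growth in insured values

as_if_loss = historical_loss * (claims_inflation * exposure_growth) ** years_elapsed
print(f"'as if' loss today: ${as_if_loss / 1e9:.1f}bn")
```

Even this simple compounding shows why a decades-old market loss understates the loss the same event would cause today.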

The real strength of geophysical-meteorological models is to be found in probabilistic simulation methods, which are much more sophisticated and comprehensive than deterministic methods.

Unlike the loss assessment for selected individual catastrophe scenarios, probabilistic modeling also produces statements on loss probabilities.

These probabilities are usually represented in the form of a PML curve (see Figure 2) or an exceedance probability (EP) curve. The scientific nucleus of such a model is a hazard module designed to estimate the return periods of a large number of possible windstorms in the geographical region under observation.

The most common method involves the statistical analysis of historical events and their extrapolation including scenarios with lower occurrence probabilities. Meteorological boundary conditions define the upper limits in terms of the maximum intensities to be expected in an event.
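The step from a simulated event set to an EP curve can be illustrated as follows. All rates and losses are invented for illustration, and the Poisson occurrence model is one common simplifying assumption, not the method of any particular commercial model:

```python
import math

# Sketch of deriving an exceedance-probability (EP) curve from a
# probabilistic event set. Each simulated event carries an annual
# occurrence rate and a modeled portfolio loss; the annual probability
# of at least one event above a loss level follows from a Poisson model.

events = [  # (annual occurrence rate, portfolio loss in $m) -- illustrative
    (0.10, 50), (0.05, 120), (0.02, 400), (0.005, 1500), (0.001, 5000),
]

def exceedance_probability(loss_level):
    """Annual probability of at least one event with loss >= loss_level."""
    rate = sum(r for r, loss in events if loss >= loss_level)
    return 1.0 - math.exp(-rate)

for level in (100, 1000, 4000):
    p = exceedance_probability(level)
    print(f"loss >= {level:>5} $m: EP = {p:.4f} (return period ~{1 / p:.0f} yrs)")
```

Reading the curve at a chosen probability (say, 0.5% per annum) yields the corresponding PML figure.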

At the time of writing (mid-2004) there is no method for the probabilistic modeling of the windstorm hazard that is generally accepted in scientific circles. The windstorm risk models that are commercially available or have been developed by Munich Re, for example, therefore differ to a greater or lesser degree in the modeling methodology they use and consequently in the results of the hazard simulation.

Vulnerability

Vulnerability is a parameter that represents the correlation between the intensity of a natural hazard event and the loss to be expected for a certain type of risk. This correlation is usually described by an exponential or hyperbolic function, i.e. the loss or damage to a risk does not increase linearly with the increase in intensity (e.g. wind speed).
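A minimal sketch of such a non-linear vulnerability function is shown below. The functional form and all parameters (`v_threshold`, `k`, `v_total`) are illustrative assumptions, not values from any actual model:

```python
import math

# Sketch of a non-linear vulnerability function: the mean damage ratio
# (share of the insured value lost) grows roughly exponentially with
# peak gust speed above a damage threshold, saturating at total loss.

def mean_damage_ratio(gust_ms, v_threshold=25.0, k=0.055, v_total=85.0):
    """Expected damage as a fraction of insured value at peak gust speed (m/s)."""
    if gust_ms <= v_threshold:
        return 0.0
    scale = math.expm1(k * (v_total - v_threshold))  # normalizes to 1.0 at v_total
    return min(1.0, math.expm1(k * (gust_ms - v_threshold)) / scale)

for v in (30, 40, 50, 60):
    print(f"{v} m/s -> mean damage ratio {mean_damage_ratio(v):.3f}")
```

The exponential shape captures the key property named in the text: a moderate increase in wind speed produces a disproportionate increase in loss.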

If the modeler does not have access to empirical loss experience, vulnerabilities have to be estimated, e.g. on the basis of 'expert knowledge'. In such cases, the focus is often on structural damage. In practice, however, the contribution to the overall insured loss made by the non-loadbearing elements of an object (contents, extensions, machinery, etc.) can be just as high or even higher. The use of real loss experience therefore produces a more reliable loss assessment than theoretical loss curves.

Even if uncertainty margins are applied to loss curves (cf. Figure 3), the construction methods and exact execution of individual risks (insured objects) differ so widely that a plausibility analysis of the results, together with a comparison of the loss experience with similar risks (in other countries too), is essential for the quality assurance of the model's results.

Insured values

Besides hazard and vulnerability, the third major factor of influence on the loss to be expected from natural hazard events is the description of the object or portfolio to be insured in geographical, constructional, and underwriting terms. The application of the 'correct' loss function, for example, depends on the re/insurer being provided with sufficiently precise information on the type, method of construction, age, construction materials, building codes applied, etc. The same goes for the correct geographical assignment of the location/locations of the risks to be covered, which is essential for the assessment of the hazard. Finally, a complete description of the insurance terms and conditions (e.g. the policyholder's deductible or the insurer's limit of indemnity) is also required. The CRESTA system (www.CRESTA.org) provides a tried and tested framework for formatting liability data for many countries.
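The effect of insurance terms and conditions on the modeled loss can be sketched with a hypothetical helper, `insured_loss`; the structure (deductible first, then limit) is standard, but the figures are illustrative:

```python
# Sketch of applying insurance conditions to a modeled ground-up loss:
# the insurer pays the loss in excess of the policyholder's deductible,
# capped at the limit of indemnity.

def insured_loss(ground_up, deductible, limit):
    """Loss to the insurer after deductible and limit of indemnity."""
    return max(0.0, min(ground_up - deductible, limit))

print(insured_loss(250_000, 10_000, 100_000))  # prints 100000 (limit applies)
print(insured_loss(5_000, 10_000, 100_000))    # prints 0.0 (below deductible)
```

Without complete information on these conditions, even a perfect hazard and vulnerability assessment cannot yield the correct insured loss.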

Reference

1. Natural catastrophes are classed as great if the ability of the region to help itself is distinctly overtaxed, making interregional or international assistance necessary. This is usually the case when thousands of people are killed, hundreds of thousands are made homeless, or when a country suffers substantial economic losses, depending on the economic circumstances generally prevailing in that country (see Munich Re's publication TOPICSgeo, Annual Review: Natural Catastrophes 2003).

- Ernst Rauch is Head of Department Weather and Climate Risks, GeoRisks Research Division, Munich Reinsurance Company, D-80791 Munich; Tel: +49 89 3891 5286; E-mail: erauch@munichre.com.