The last several years, particularly 1998 and 1999, have been very difficult ones for the insurance industry. During this period the industry incurred catastrophe losses of $10.1 billion in 1998 and $8.3 billion in 1999, which contributed significantly to its poor financial performance. According to Standard & Poor's, the European reinsurance market's combined ratio was 109% in 1998 and “a massive” 131% in 1999. A record number of catastrophe events during the year largely drove the poor 1999 results. As a result, prices are starting to rise and companies are trying to become more aggressive in managing their natural catastrophe exposure.

The question to ask, and the focus of this article, is: what can companies do to become more aggressive and effective in managing their catastrophe exposure? Currently, most companies concentrate their efforts on controlling their overall accumulations and, to a lesser extent, on underwriting new business. Their ability to do this has been greatly enhanced by the development of catastrophe modeling software. Most reinsurance and primary companies now use catastrophe modeling software to manage their accumulations. In addition, many of the reinsurers and some of the primary companies use the models to underwrite business.

Based on the number of companies using the software, it would appear that they are being very aggressive in managing their exposure. The problem is that many companies are not using the software to underwrite business, and those that do are not coordinating it well with their accumulation programmes. The net result is that they are not selectively screening business to optimize the use of their available capacity (for example, writing only better-than-average accounts in areas with little remaining capacity and accepting average accounts in areas with ample capacity).

How can companies improve their catastrophe management programmes? First, the objective of every catastrophe management programme should be to optimize the risk/return relationship. Insurance 101 tells us that the way to achieve this objective is to properly identify which accounts (new and renewal) should be written or declined, to price the accounts being written appropriately, and to maintain a geographically diversified portfolio. Fortunately, the catastrophe modeling software products currently being introduced provide the tools needed to underwrite and price accounts while maintaining a diversified portfolio.

EQECAT's new WORLDCAT enterprise™ software is an example of one of these new models. It gives underwriters the ability to examine the key underwriting and pricing factors for an individual account on its own, and to see the impact that the account will have on the overall portfolio using the same key risk measures. Figure 1 shows the type of risk measures being provided by catastrophe models.

The risk measures shown in the top section of Figure 1 are the ones used by most reinsurers to underwrite and price accounts (for example, annual loss, standard deviation, rate on line and PML expressed as a 100-, 250- or 500-year loss). The product also provides the COV (coefficient of variation, a relative measure of volatility), the probability of penetrating and exceeding each layer of the programme, and various points on the loss distribution. It also includes a new item called the Calculated Rate on Line (CROL). CROL is a rating tool that enables the user to develop the appropriate rate for the exposure based on the estimated annual loss, standard deviation and expense load; it is a user-defined function that can vary by type of account or other factors. With all of the information outlined so far, a company can effectively underwrite and price any programme on a standalone basis. However, as noted above, this is only half of the answer.
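
To make the idea concrete, the sketch below shows one way a calculated rate on line could be built from the modelled annual loss, standard deviation and an expense load. It is purely illustrative: the function name, the volatility-load and expense parameters, and the formula itself are assumptions, not the user-defined CROL formula in WORLDCAT enterprise.

```python
def calculated_rate_on_line(annual_loss, std_dev, limit,
                            risk_load_factor=0.3, expense_ratio=0.25):
    """Illustrative calculated rate on line for a layer.

    annual_loss      -- modelled expected annual loss to the layer
    std_dev          -- standard deviation of the annual loss to the layer
    limit            -- layer limit (the capacity exposed)
    risk_load_factor -- assumed volatility load applied to the standard deviation
    expense_ratio    -- assumed expenses as a share of gross premium
    """
    # Pure premium plus a load for volatility.
    risk_premium = annual_loss + risk_load_factor * std_dev
    # Gross up for expenses, then express as a rate on the layer limit.
    gross_premium = risk_premium / (1.0 - expense_ratio)
    return gross_premium / limit

# Example: a layer with a $2m expected annual loss, $6m standard deviation and $50m limit.
print(f"CROL = {calculated_rate_on_line(2e6, 6e6, 50e6):.2%}")   # roughly 10%
```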

The bottom section of the screen provides the critical missing information needed to optimize the portfolio and achieve the overall objective of optimizing the risk/return relationship: it shows the impact of adding a new programme to the overall portfolio. This is very important, since a programme that is highly correlated with the overall portfolio will contribute significantly more to the portfolio's exposure and/or volatility than one that is not, and that correlation should be reflected in the underwriting and pricing of the individual programme. As a side note, how a model handles correlation is very important, and every company should clearly understand how its vendor handles or models correlation.
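
The effect of correlation on this marginal view can be illustrated with a small simulation. In the sketch below (Python), the simulated annual losses, the gamma parameters and the 250-year return period are all invented for illustration; the point is simply that an account whose losses move with the portfolio adds far more to the portfolio's tail than a similarly sized account that does not.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 50_000                                   # simulated years (illustrative)

# Simulated annual catastrophe losses for an existing portfolio and two candidates.
portfolio = rng.gamma(0.5, 40e6, n_years)
correlated = 0.2 * portfolio + rng.gamma(0.5, 4e6, n_years)   # driven by the same events
uncorrelated = rng.gamma(0.5, 8e6, n_years)                   # driven by different events

def pml(losses, return_period=250):
    """Annual loss at the given return period (empirical quantile)."""
    return np.quantile(losses, 1.0 - 1.0 / return_period)

base = pml(portfolio)
for name, account in [("correlated account", correlated),
                      ("uncorrelated account", uncorrelated)]:
    standalone = pml(account)                      # the account viewed on its own
    marginal = pml(portfolio + account) - base     # what it adds to the 250-year loss
    print(f"{name}: standalone PML {standalone:,.0f}, marginal PML {marginal:,.0f}")
```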

While the above approach is a vast improvement over prior underwriting and accumulation management approaches, still more advanced portfolio optimization techniques are available to companies on a consulting basis. Being most familiar with EQECAT's products and services, the author will demonstrate the advantages of these advanced approaches using EQECAT's Exceedance Probability Leverage Analysis (EPLA) methodology. EPLA's goal is to enable the company to achieve its overall objective by evaluating each account both on its own individual characteristics and on its relative impact on the overall portfolio compared with the other accounts in the portfolio. The methodology produces an Exceedance Probability Leverage Curve (EPLC) for each account, which gives the account's leverage at each level of exceedance probability. High leverage means that the risk-adjusted return for the account is low.
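
EPLA itself is a proprietary EQECAT methodology, so the following is only a rough sketch, under assumed definitions, of what a leverage-style curve might look like: the marginal tail loss an account adds to the portfolio per unit of premium, evaluated at several exceedance probabilities. The simulated losses, the premiums and the leverage formula are all illustrative assumptions, not the EPLC calculation.

```python
import numpy as np

# Same illustrative setup as the previous sketch: an existing book plus two candidates.
rng = np.random.default_rng(0)
n_years = 50_000
portfolio = rng.gamma(0.5, 40e6, n_years)
accounts = {
    "correlated account": 0.2 * portfolio + rng.gamma(0.5, 4e6, n_years),
    "uncorrelated account": rng.gamma(0.5, 8e6, n_years),
}
premiums = {"correlated account": 7.5e6, "uncorrelated account": 5.0e6}  # hypothetical

def leverage_curve(portfolio, account, premium,
                   exceedance_probs=(0.01, 0.004, 0.002)):
    """Assumed leverage measure: marginal tail loss added per unit of premium.

    Higher values mean the account consumes more of the portfolio's tail
    capacity for each unit of premium it brings in, i.e. a poorer
    risk-adjusted return at that exceedance probability.
    """
    curve = {}
    for p in exceedance_probs:
        with_account = np.quantile(portfolio + account, 1.0 - p)
        without = np.quantile(portfolio, 1.0 - p)
        curve[p] = (with_account - without) / premium
    return curve

for name, account in accounts.items():
    curve = leverage_curve(portfolio, account, premiums[name])
    print(name, {p: round(v, 2) for p, v in curve.items()})
```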

Using tools like EPLA, portfolio managers can determine the following (a simple ranking sketch follows the list):

  • Appropriate pricing for an account

  • Which accounts to remove from a portfolio to maximize risk-adjusted return

  • Where to grow or shrink business geographically to achieve maximum diversification benefits

  • How to swap parts of two different portfolios to optimize the risk/return relationship.
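
For example (again only as an illustrative sketch, not the EPLA algorithm), leverage figures of this kind could be used to rank accounts and flag the worst performers for review or removal. The leverage values and the threshold below are invented:

```python
# Hypothetical per-account leverage values (e.g. marginal 250-year loss added per
# unit of premium) such as a leverage curve might produce; all figures are invented.
leverage_250yr = {
    "account A": 1.8,
    "account B": 6.4,
    "account C": 0.9,
    "account D": 4.1,
}

review_threshold = 3.0   # assumed tolerance for tail loss added per unit of premium

for name, lev in sorted(leverage_250yr.items(), key=lambda kv: kv[1], reverse=True):
    action = "review / candidate for removal" if lev > review_threshold else "retain"
    print(f"{name}: leverage {lev:.1f} -> {action}")
```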

The key element to keep in mind when using any of the advanced portfolio optimization techniques currently available is the robustness of the underlying model. Being truly effective in achieving the intended result requires a very technically advanced catastrophe model.

The following is an actual analysis that was conducted using the EPLA methodology. For the purpose of this analysis, the corporate portfolio optimization objectives were defined as follows (a simple check of these targets is sketched after the list):

    Minimize

  • 100 year to 500 year loss

    Maximize

  • Premium

    Constraints

  • Expected return on capital not less than 15%

  • 100 year loss to premium ratio not more than 10

  • 250 year loss to premium ratio not more than 20

  • 500 year loss to premium ratio not more than 30
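
As an illustration of how such targets might be screened, the sketch below checks a single candidate portfolio against the stated constraints. The metric names and the portfolio figures are hypothetical; in practice the loss amounts would come from the catastrophe model's portfolio analysis, and the full optimization would trade premium against tail losses across many candidate portfolios rather than simply testing one.

```python
def meets_constraints(premium, roc, loss_100, loss_250, loss_500):
    """Check a candidate portfolio against the stated constraints (losses and premium in $m)."""
    checks = {
        "expected return on capital >= 15%": roc >= 0.15,
        "100-year loss / premium <= 10": loss_100 / premium <= 10,
        "250-year loss / premium <= 20": loss_250 / premium <= 20,
        "500-year loss / premium <= 30": loss_500 / premium <= 30,
    }
    return all(checks.values()), checks

# Hypothetical candidate portfolio; note the 500-year constraint fails here,
# so this mix of accounts would need to be re-optimized.
ok, detail = meets_constraints(premium=120, roc=0.17,
                               loss_100=1_050, loss_250=2_300, loss_500=3_900)
print("feasible:", ok)
for name, passed in detail.items():
    print(f"  {name}: {'pass' if passed else 'fail'}")
```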

The EPLA analysis looked at each account to determine how much it was leveraging the portfolio, whether the revenue was appropriate given the account's leverage factor, and whether the account should be eliminated, reviewed or retained in order to achieve the stated objectives. The methodology essentially made a value judgement on the relative merits of each account to determine which ones should be retained and which should be eliminated. The analysis also identified areas where additional exposure could be written with minimal impact on the overall exposure. The above table contains the results of the EPLA analysis.

Obviously, not everyone will achieve results as dramatic as those shown in the above example. However, it is safe to say that every company incorporating one or more of these portfolio optimization techniques will benefit and move closer to achieving the ultimate objective of optimizing the risk/return relationship.

Richard L. Clinton, Vice President, EQECAT, Inc.