Tim Beauchamp argues that technology can provide management with a more scientific approach to underwriting.

Arguably, there is now – or there was until September 11 – a convergence of three adverse factors in insurance markets:

  • a surfeit of capital, perhaps to the point where underwriters were complacent about the rates at which they wrote their business, in the full knowledge that if they generated underwriting losses, there would be plenty of capital to fall back on. Paradoxically, the remedy is to reduce capital strength;

  • escalating claims experience, and there can't be a better, if sadder, example of the potential magnitude of claims than the current loss estimates for the recent disasters in the US; and

  • a serious downturn in investment markets and returns.

    The untimely convergence of these three factors is likely to have a decisive impact on the financial well-being of insurers and reinsurers. In turn, this prompts consideration of a range of issues, most importantly the role of IT.

    First principles
    But let us begin with first principles. Underwriters write risks, and accept premium for doing so; out of the fund of premiums their underwriting generates, they pay claims. But which is the chicken, and which the egg? Which comes first: rating the business, whether in line with market forces or with regard to corporate obligations and goals; meeting liabilities as and when they fall due; or satisfying the expectations of investors and regulators and, perhaps, of public perception through rating agency assessments?

    Regrettably in many instances, and fatally in some cases, it is competitive market pricing which has prevailed. Underwriting standards have been sacrificed to placate the great god of market share. The pitfalls of cash flow underwriting, whether in soft markets or not, have been evident for years and are well documented by spectacular failures. Looking forward, many market observers believe there are other fundamental problems facing the industry. Consider the following, for example:

  • rating concepts are relatively unsophisticated. Rates themselves are driven more by market levels than by risk factors;

  • the level of knowledge of risk is poor – there is arguably a lack of adequate training and experience;

  • rate increases have not adequately reflected or matched economic and claims inflation, let alone diminishing investment returns;

  • ready access to reinsurance cover (which has in many cases also been underpriced), indicative of overcapacity and under-reserving by cedants, is probably on the turn; nevertheless there remain

  • reinsurance failure, actual and potential;

  • continuing adverse loss development and experience; and

  • poor claims-reserving performance, for which there is little excuse.

    To this list can be added statistical data that is defective in accuracy, completeness and relevance.

    Again, back to basics. It is management's responsibility properly to plan, record, control, review and report the affairs of the company. To the extent that the above-listed problems are controllable, they are undeniably the responsibility of management to address.

    Output data is only as good as its input. Accepting a risk implies the requirement in the first instance to capture details of it, and to monitor premium and claims payments. This specific information can then be analysed in the context of aggregate exposure accumulations, for example. And then loss ratios of particular portfolios can be projected, in turn feeding prescriptive information back to original underwriting decisions.
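
    To make that chain concrete, the sketch below is an illustration only, not a description of any particular administration system: hypothetical policy and claim records are captured, aggregated by portfolio, and turned into the exposure accumulations and loss ratios that can feed back into underwriting decisions. The record layout, portfolio names and figures are all invented for the example.

        from dataclasses import dataclass
        from collections import defaultdict

        @dataclass
        class Policy:
            policy_id: str
            portfolio: str       # e.g. "property-cat", "marine" (invented labels)
            premium: float       # written premium
            sum_insured: float   # exposure contributed to the aggregate

        @dataclass
        class Claim:
            policy_id: str
            portfolio: str
            paid: float          # payments to date
            outstanding: float   # case reserve

        def portfolio_summary(policies, claims):
            """Aggregate premium, exposure and incurred claims by portfolio."""
            premium = defaultdict(float)
            exposure = defaultdict(float)
            incurred = defaultdict(float)
            for p in policies:
                premium[p.portfolio] += p.premium
                exposure[p.portfolio] += p.sum_insured
            for c in claims:
                incurred[c.portfolio] += c.paid + c.outstanding
            return {
                pf: {
                    "premium": premium[pf],
                    "aggregate_exposure": exposure[pf],
                    "incurred": incurred[pf],
                    "loss_ratio": incurred[pf] / premium[pf] if premium[pf] else None,
                }
                for pf in premium
            }

        # Invented records, standing in for what the core system captures.
        policies = [
            Policy("P1", "property-cat", 250_000, 40_000_000),
            Policy("P2", "property-cat", 180_000, 25_000_000),
            Policy("P3", "marine", 90_000, 6_000_000),
        ]
        claims = [
            Claim("P1", "property-cat", 120_000, 60_000),
            Claim("P3", "marine", 30_000, 15_000),
        ]

        for portfolio, stats in portfolio_summary(policies, claims).items():
            print(portfolio, stats)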

    Analysing claims developments (triangulations) facilitates claims reserving; the timing of claims payments, with its implications for cash flow, can and should dictate investment strategy and asset management. At the outset, though, there must be corporate/management objectives to decide underwriting policy, lines of business, business mix, solvency targets, rating guides and the like. There must be plans – projections of premiums and claims – and then systems can be applied to monitor performance against prescription.
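
    Purely as a sketch of the triangulation idea, and not of any particular reserving tool, the example below estimates chain-ladder development factors from a small cumulative claims triangle and projects each origin year to an ultimate cost, from which indicated reserves follow. The figures are invented, and a real reserving exercise would temper such arithmetic with considerable actuarial judgement.

        # Cumulative paid claims by origin year and development period (invented figures).
        triangle = {
            1998: [1000, 1800, 2200, 2400],
            1999: [1100, 2000, 2500],
            2000: [1300, 2300],
            2001: [1500],
        }

        def development_factors(tri):
            """Volume-weighted link ratios between successive development periods."""
            max_dev = max(len(v) for v in tri.values())
            factors = []
            for d in range(max_dev - 1):
                num = sum(v[d + 1] for v in tri.values() if len(v) > d + 1)
                den = sum(v[d] for v in tri.values() if len(v) > d + 1)
                factors.append(num / den)
            return factors

        def project_ultimates(tri):
            """Roll each origin year forward to ultimate using the estimated factors."""
            factors = development_factors(tri)
            ultimates = {}
            for year, values in tri.items():
                ultimate = values[-1]
                for f in factors[len(values) - 1:]:
                    ultimate *= f
                ultimates[year] = round(ultimate)
            return ultimates

        ultimates = project_ultimates(triangle)
        reserves = {year: ult - triangle[year][-1] for year, ult in ultimates.items()}
        print("Projected ultimates:", ultimates)
        print("Indicated reserves: ", reserves)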

    But is management receiving and making enough of the information it has available? Possibly not. Is it drawing the right conclusions? Possibly not. The insurance industry is dynamic, and therefore permanently in a state of disequilibrium. This means that the management process needs also to be dynamic if corporate objectives and long-term financial well-being are to be achieved.

    Three tiers
    Information risk is a business risk. Indeed, it is arguably the business or operational risk. Surrounding all IT processes must be control systems that maintain security and integrity. In insurance at least, one can analyse these controls on three levels.

  • On the first tier, the database system must be capable of capturing and recording all data that is required to write the business and account for it. This tier is the core business administration platform, handling data, the database and application functions.

  • On the second tier, data must be capable of analysis so that it can be broken down to its ultimate components. The tools available today promise analytical power on a scale only dreamed of until now. Companies using such products are likely to enjoy a significant advantage: access to data, the capability to sort it into virtually any shape or format, and the means to report the processed information to each operating division. This tier is the method of critically appraising the data available in the first tier.

  • On the third tier, there is forecasting. Again, tools are available to assist with the rating process, reinsurance placing, run-off analysis – indeed, any form of financial planning. These tools use statistical techniques to determine probabilities for the ultimate outcomes of future events, as the sketch below illustrates. This tier is predictive of future business scenarios based on data from the second tier. But the flow of information is not always one way, from the first tier to the second and then to the third: output from the third tier can feed back to the second, helping to refine the analysis, and output from that analysis can feed back to the first tier, helping to identify any need for additional functionality or data capture in the core business system.
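
    As a hedged illustration of the kind of statistical forecasting the third tier performs, the sketch below runs a simple frequency-severity Monte Carlo simulation: claim counts are drawn from an assumed Poisson distribution, claim sizes from an assumed lognormal distribution, and the simulated years are summarised as probabilities for the ultimate outcome. The distributions, parameters and threshold are invented for the example; in practice they would be fitted to the data held in the first two tiers.

        import math
        import random
        from statistics import quantiles

        random.seed(42)

        def poisson(lam):
            """Sample a Poisson-distributed claim count (Knuth's method)."""
            limit = math.exp(-lam)
            k, p = 0, 1.0
            while True:
                p *= random.random()
                if p <= limit:
                    return k
                k += 1

        def simulate_annual_loss(freq_mean=12, sev_mu=10.5, sev_sigma=1.2):
            """One simulated year: Poisson claim count, lognormal claim sizes."""
            n = poisson(freq_mean)
            return sum(random.lognormvariate(sev_mu, sev_sigma) for _ in range(n))

        # Run many simulated years and read off the distribution of outcomes.
        trials = [simulate_annual_loss() for _ in range(10_000)]
        mean_loss = sum(trials) / len(trials)
        p95 = quantiles(trials, n=100)[94]          # 95th percentile
        prob_breach = sum(t > 1_500_000 for t in trials) / len(trials)

        print(f"Mean annual loss:        {mean_loss:,.0f}")
        print(f"95th percentile:         {p95:,.0f}")
        print(f"P(annual loss > 1.5m):   {prob_breach:.1%}")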

    This three-tier approach is a methodology for understanding the IT environment. It provides a measurement of risk inherent within an IT strategy and the application of that strategy, and it provides a framework for effective business control and IT risk management. The methodology is fresh and it is radical, but it is not revolutionary: it is beautifully simple in its clarity, and it is comprehensive – providing an integrated whole. And the whole is greater than the sum of the parts.

    On each tier, risk exposure can be readily assessed.

  • On the first tier – containing core applications, the network, databases and operating systems – the risks include system security, processing integrity, process efficiency and the appropriateness to the organisation's data/function/reporting needs.

  • On the second tier – with standard reporting, ad hoc reporting, queries and trend analysis – the risks include the adequacy of data capture, the suitability of data for analysis, and its accuracy, completeness and timeliness.

  • On the third tier – for prediction, monitoring, analysis and feedback – the risks include scenario validity, the suitability of underlying data, failure to compare actual results against projections, and failure to assess system feedback requirements.

    And for each risk, on each tier, there are identifiable solutions.

    Insurers are going through a rough time at the moment. In the immediate wake of a disaster, investors understandably panic. But recent history shows that following a major earthquake, for example, or a massive flood, the fortunes of well-funded and well-founded insurers and reinsurers recover, as reflected in the share price of listed underwriters. The demand for insurance rises in accordance with the perception that the world is becoming a more dangerous place.

    Then, insurers and reinsurers are in a position to charge the rates that the risk warrants, rather than the ones which competition dictates. Today, the situation is helped by the fact that regulators are taking an increasingly tight grip on insurance markets worldwide, with the result that there is less excess capacity floating around.

    In the aftermath of a disaster, the insurance market is as close as it ever gets to providing a technically sound and well-researched product, selling for a competitive price. But history shows that the market quickly forgets the significance of this, and sooner rather than later resumes underwriting to tight margins without a true perspective of actual and accumulated risk. So there is a need for a more scientific approach.

    IT is central to that. IT strategies should be kept constantly under review; recent world events show the value of doing just that. Indeed, it is now imperative. The market will continue to become more risky, and it ignores IT at its peril.