Tim Beauchamp argues that technology can provide management with a more scientific approach to underwriting.
Arguably, there is now – or there was until September 11 – a convergence of three adverse factors in insurance markets:
The untimely convergence of these three factors is likely to have a decisive impact on the financial well-being of insurers and reinsurers. In turn, this prompts consideration of a range of issues but, most importantly, consideration of the role of IT.
But let us begin with first principles. Underwriters write risks, and accept premium for doing so; out of the fund of premiums their underwriting generates, they pay claims. But which is the chicken, and which the egg? What comes first: rating the business, whether in line with market forces or with regard to corporate obligations and goals; meeting liabilities as and when they fall due; or satisfying the expectations of investors and regulators – and, through rating agency assessments, of the public?
Regrettably in many instances, and fatally in some cases, it is competitive market pricing which has prevailed. Underwriting standards have been sacrificed in placating the great god of market share. The pitfalls of cash flow underwriting, whether in soft markets or not, have been evident for years and are well documented by spectacular failures. Looking forward, many market observers believe there are other fundamental problems facing the industry. Consider the following, for example:
This list can be augmented by including statistical data that is deficient in accuracy, completeness and relevance.
Again, back to basics. It is management's responsibility properly to plan, record, control, review and report the affairs of the company. To the extent that the above-listed problems are controllable, they are undeniably the responsibility of management to address.
Output data is only as good as its input. Accepting a risk implies the requirement in the first instance to capture details of it, and to monitor premium and claims payments. This specific information can then be analysed in the context of aggregate exposure accumulations, for example. And then loss ratios of particular portfolios can be projected, in turn feeding prescriptive information back to original underwriting decisions.
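The data flow described above can be sketched in a few lines of code. This is a minimal, illustrative sketch only – the record structure and names (`Risk`, `aggregate_exposure`, `loss_ratio`) are assumptions for the example, not taken from any particular underwriting system:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    portfolio: str      # line of business, e.g. "property"
    premium: float      # written premium captured at acceptance
    claims_paid: float  # claims paid to date against the risk

def aggregate_exposure(risks):
    """Sum written premium by portfolio to show accumulations."""
    totals = {}
    for r in risks:
        totals[r.portfolio] = totals.get(r.portfolio, 0.0) + r.premium
    return totals

def loss_ratio(risks, portfolio):
    """Paid claims as a proportion of premium for one portfolio."""
    premium = sum(r.premium for r in risks if r.portfolio == portfolio)
    claims = sum(r.claims_paid for r in risks if r.portfolio == portfolio)
    return claims / premium if premium else 0.0

book = [
    Risk("property", 1000.0, 650.0),
    Risk("property", 500.0, 400.0),
    Risk("marine", 800.0, 200.0),
]
print(aggregate_exposure(book))      # {'property': 1500.0, 'marine': 800.0}
print(loss_ratio(book, "property"))  # 0.7
```

The point is the feedback loop: once each risk is captured at the point of acceptance, accumulations and loss ratios fall out of the same records, ready to inform the next underwriting decision.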
Analysing claims developments (triangulations) facilitates claims reserving; the timing of claims payments, with its inferences for cash flow, can and should dictate investment strategy and asset management. At the outset, though, there must be corporate/management objectives to decide underwriting policy, lines of business, business mix, solvency targets, rating guides and the like. There must be plans – projections of premiums and claims – and then systems can be applied to monitor performance against prescription.
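One standard way to turn a claims triangulation into reserves is the chain-ladder method; the following is a simplified sketch of that technique (volume-weighted development factors on a cumulative paid triangle), offered as an illustration rather than a reserving tool:

```python
def chain_ladder_ultimates(triangle):
    """Project ultimate claims from a cumulative development triangle.

    triangle[i] holds cumulative paid claims for origin year i at
    successive development periods; later origin years have fewer entries.
    """
    n = len(triangle[0])
    # Volume-weighted development factors between successive periods.
    factors = []
    for d in range(n - 1):
        num = sum(row[d + 1] for row in triangle if len(row) > d + 1)
        den = sum(row[d] for row in triangle if len(row) > d + 1)
        factors.append(num / den)
    # Roll each incomplete origin year forward to ultimate.
    ultimates = []
    for row in triangle:
        value = row[-1]
        for d in range(len(row) - 1, n - 1):
            value *= factors[d]
        ultimates.append(value)
    return ultimates

triangle = [
    [100.0, 150.0, 165.0],  # oldest origin year, fully developed
    [120.0, 180.0],
    [140.0],
]
print(chain_ladder_ultimates(triangle))
```

The gap between projected ultimates and claims paid to date is the reserve; the expected timing of those payments is what should drive the investment strategy the paragraph above describes.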
But is management receiving and making enough of the information it has available? Possibly not. Is it drawing the right conclusions? Possibly not. The insurance industry is dynamic, and therefore permanently in a state of disequilibrium. This means that the management process needs also to be dynamic if corporate objectives and long-term financial well-being are to be achieved.
Information risk is a business risk. Indeed, it is arguably the business or operational risk. Surrounding all IT processes must be control systems that maintain security and integrity. In insurance at least, one can analyse these controls on three levels.
This three-tier approach is a methodology for understanding the IT environment. It provides a measurement of risk inherent within an IT strategy and the application of that strategy, and it provides a framework for effective business control and IT risk management. The methodology is fresh and it is radical, but it is not revolutionary: it is beautifully simple in its clarity, and it is comprehensive – providing an integrated whole. And the whole is greater than the sum of the parts.
On each tier, risk exposure can be readily assessed.
And for each risk, on each tier, there are identifiable solutions, as shown in the bottom diagram on the previous page.
Insurers are going through a rough time at the moment. In the immediate wake of a disaster, investors understandably panic. But recent history shows that following a major earthquake, for example, or a massive flood, the fortunes of well-funded and well-founded insurers and reinsurers recover, as reflected in the share price of listed underwriters. The demand for insurance rises in accordance with the perception that the world is becoming a more dangerous place.
Then, insurers and reinsurers are in a position to charge the rates that the risk warrants, rather than the ones which competition dictates. Today, the situation is helped by the fact that regulators are taking an increasingly tight grip on insurance markets worldwide, with the result that there is less excess capacity floating around.
In the aftermath of a disaster, the insurance market is as close as it ever gets to providing a technically sound and well-researched product, selling for a competitive price. But history shows that the market quickly forgets the significance of this, and sooner rather than later resumes underwriting to tight margins without a true perspective of actual and accumulated risk. So there is a need for a more scientific approach.
IT is central to that. IT strategies should be kept constantly under review. Recent world events show the value of doing just that. Indeed, it is now imperative. The market will continue to become more risky, and will ignore IT at its peril.