Dynamic Financial Analysis (DFA) has been the “buzz” of the reinsurance asset/liability modeling community for the past few years. The huge market momentum in DFA applications has been focused primarily on helping firms to compare reinsurance treaties and to simulate the financial health of a company under various asset allocations and economic scenarios. DFA has proven to be a superior technique for measuring risks and rewards in a context relevant to the day-to-day business of a reinsurance company. Even so, DFA comes with its own set of drawbacks. The sheer volume of data resulting from DFA-based modeling does not lend itself to easy interpretation. It is difficult for senior management to understand the business implications and, more importantly, the effects of their decisions on all relevant aspects of their business. With maturation, DFA will become a more intuitive technique, the results of which will lead management to make more informed decisions within their unique constraints and conditions. An “analysis of the analysis” will facilitate management decision-making.

In the “Catching the boat” article, which appeared in the March 2000 issue of Global Reinsurance, we demonstrated that DFA was the desired solution for gaining a better understanding of enterprise risk arising from the asset allocation decision. DFA is a financial modeling process that examines the effects of various investment strategies using the variability of and interaction among assets, liabilities, capital and external factors such as economic cycles, corporate taxation and insurance regulation. When the modeling is done correctly, the company obtains a representative set of balance sheets, income statements, cash flow statements, financial ratios and numerous other financial data. Most DFA models in the marketplace output a standard set of variables. Our licensed DFA program is capable of projecting over 650 different financial variables, including financial parameters, IRIS ratios and other relevant statistics. Measures of risk and reward can be computed as a mathematical combination of one or more of these financial variables.

Beyond DFA, we use an optimizer to obtain an efficient set of asset allocations, based on the financial condition the company wishes to achieve (their “objective function”). Using the optimized data files, we can calculate risk and reward measures, for example, probability of ruin, contributions (premiums) from shareholder-insureds, economic value of surplus, terminal assets, and so on, and plot them on an efficient frontier. Each point on the efficient frontier corresponds to an optimal asset allocation.
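To make the idea of per-allocation risk and reward measures concrete, here is a minimal sketch of how two such measures, probability of ruin and expected terminal surplus, could be computed from per-scenario results. The numbers and distribution below are invented stand-ins for actual DFA output, not the article's model:

```python
import numpy as np

# Hypothetical simulated ending surplus ($m) for one candidate asset
# allocation: one value per economic scenario (1,000 scenarios).
rng = np.random.default_rng(seed=42)
ending_surplus = rng.normal(loc=120.0, scale=40.0, size=1000)

# Reward measure: expected (mean) ending surplus across scenarios.
reward = ending_surplus.mean()

# Risk measure: probability of ruin, i.e. the fraction of scenarios
# in which surplus is exhausted by the end of the horizon.
prob_ruin = (ending_surplus <= 0).mean()

print(f"expected ending surplus: {reward:.1f}")
print(f"probability of ruin:     {prob_ruin:.3f}")
```

Any of the other measures mentioned (contributions from shareholder-insureds, economic value of surplus, terminal assets) would be computed the same way, as a function of the per-scenario projections, and each candidate allocation then contributes one risk/reward point toward the efficient frontier.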

Figure 1 contains a plot of a sample efficient frontier and the associated asset allocation for point F. An obvious question to ask at this point: is it prudent to look only at the efficient frontier? No. If we limit our interpretation of the analysis to this pair of graphs, we are not fully leveraging the DFA process. While the investment strategy is optimized against the defined objective functions alone, the modeling process generates a wealth of additional information. Access to this additional data can enhance management's understanding of the financial projections and inform the decision-making process. There is a story behind each efficient frontier point and objective function, and it must be investigated further with a data analysis tool.

Before we begin defining the capabilities of a good data visualization and analysis tool, it is important to understand the nature of the results. The scenario file is stochastically generated within the model and drives most of the analysis. We typically use 1,000 scenarios that give a consistent representation of economic cycles and asset markets. The results contain projected values for each economic scenario. Hence, we have the ability to compute the average of each financial variable across all scenarios, and to examine the maximum, minimum or any percentile value.
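As an illustration of these cross-scenario statistics, the sketch below computes the mean, extremes and selected percentiles of a projected variable for each year of the horizon. The simulated surplus paths are hypothetical placeholders for real DFA output:

```python
import numpy as np

# Hypothetical DFA output: projected surplus for 1,000 scenarios
# over a 5-year horizon (rows = scenarios, columns = years).
rng = np.random.default_rng(seed=1)
surplus = 100 + np.cumsum(rng.normal(5.0, 15.0, size=(1000, 5)), axis=1)

# Summary statistics for each projection year, taken across scenarios.
mean_by_year = surplus.mean(axis=0)
min_by_year = surplus.min(axis=0)
max_by_year = surplus.max(axis=0)
pct_5, median, pct_95 = np.percentile(surplus, [5, 50, 95], axis=0)
```

The same reduction applies to any of the projected financial variables, since each is simply a scenarios-by-years array in the output.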

By evaluating a strategy across a wide range of scenarios, the decision-maker can understand the risks and opportunities for the desired strategy. Management can investigate the financial or regulatory effects of making any decision; for example, track the growth of net income or determine the company's RBC levels at the end of each year. By analyzing the entire data set, management can satisfy any other concerns that might arise. It can also determine any unintended consequences of a chosen strategy or allocation. For example, if the board of directors optimizes the allocation for the preservation of capital, but also wishes to earn a minimum total return, the directors have the ability to determine the impact of changing asset allocations on each of their objectives. Other commonly used measures that can significantly affect the choice of an investment strategy include balance sheet information, cash flow from operations, and whether surplus falls below regulatory levels for two consecutive quarters.
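The last check mentioned above, surplus falling below regulatory levels for two consecutive quarters, is straightforward to express against per-scenario output. The sketch below flags breaching scenarios and estimates how often the breach occurs; the surplus paths and the regulatory floor are assumed values for illustration:

```python
import numpy as np

# Hypothetical quarterly surplus paths: 1,000 scenarios x 8 quarters.
rng = np.random.default_rng(seed=3)
surplus = 100 + np.cumsum(rng.normal(1.0, 10.0, size=(1000, 8)), axis=1)
REGULATORY_FLOOR = 80.0  # assumed regulatory minimum, for illustration

below = surplus < REGULATORY_FLOOR
# A scenario breaches if surplus is below the floor in any two
# consecutive quarters, i.e. a quarter and its successor are both below.
breach = (below[:, :-1] & below[:, 1:]).any(axis=1)
breach_prob = breach.mean()
```

The resulting breach probability is itself a candidate risk measure that management could weigh alongside the objectives the allocation was optimized for.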

With the output from the DFA process, robust data visualization and analysis tools can produce a tangible description of the results.

Standard visualizations that we use include tabulated balance sheet values across the modeled years, bar graphs that show the growth/decline of the median net income and surplus over time (figure 2), diagrams that depict RBC levels for all periods, and so on. Utilizing the stochastic nature of the analysis, we can look at the distribution of any financial variable across all scenarios in the form of a histogram.

We can also compute confidence intervals for any financial variable (for example, surplus) and plot them as a Tukey box plot, commonly known as a “box & whiskers” graph, as seen in figure 3. Of course, these representations, which allow us to chart the expected outcomes and visualize the variability of the upside/downside potential, can be produced for any of the 650 financial variables: reserves, assets, RBC values, paid federal income taxes (FIT), and so on. Looking at this information collectively, we can exhaustively determine the effects of choosing an investment strategy given the company's appetite for risk.
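The five values a Tukey box plot summarizes (quartiles, whiskers and outliers) can be derived directly from the scenario distribution. The sketch below computes them for a hypothetical surplus distribution; the data are simulated stand-ins, not model output:

```python
import numpy as np

# Hypothetical surplus distribution across 1,000 scenarios for one year.
rng = np.random.default_rng(seed=7)
surplus = rng.lognormal(mean=4.7, sigma=0.3, size=1000)

# Quartiles and interquartile range: the box of the Tukey box plot.
q1, median, q3 = np.percentile(surplus, [25, 50, 75])
iqr = q3 - q1

# Whiskers extend to the most extreme data points within 1.5 * IQR
# of the box; anything beyond is plotted as an individual outlier.
lower_whisker = surplus[surplus >= q1 - 1.5 * iqr].min()
upper_whisker = surplus[surplus <= q3 + 1.5 * iqr].max()
outliers = surplus[(surplus < lower_whisker) | (surplus > upper_whisker)]
```

Repeating this for each projection year gives one box per year, which is how a chart like figure 3 would be assembled.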

There are, however, a few additional concerns. There is no way to ascertain ahead of time which financial variables matter most to decision-makers, nor any guarantee that the answers lie within a subset of the projected data. Moreover, there is no way to anticipate every question that might arise. “Analyzing the analysis” can take many separate sessions, lengthened by the time needed to retrieve and chart the results. Wouldn't it be useful to have the data ready and formatted as the results are being considered and discussed?

Consequently, it is essential to utilize a data analysis tool that facilitates a comprehensive understanding of the results. We are developing one such tool that collates the results from the DFA process and delivers them to the decision-makers over the internet. The user will have the ability to dynamically create, modify and analyze all of the financial data generated during the analysis (as discussed earlier). This ability to visualize and explore the DFA results interactively realizes the promise of DFA. The “analysis of the analysis” is an important innovation in the speed and usefulness of DFA-based modeling. A robust data analysis tool engenders a much better understanding of how a decision will affect the company as a going concern. It provides the decision-makers with a more tangible description of the results, leading them to a more informed decision that takes into consideration their real-life constraints and many more potential consequences of a particular strategy.

Jayant Kumar is a portfolio strategist, Insurance Asset Management Group, at Brown Brothers Harriman, New York.