The winning paper in the Lumina Awards ‘risk modelling' category, by Oliver Peterken, Nigel Davis, Matthew Foote, Karl Jones and Shigeko Tabuchi, outlines Turkey's national catastrophe risk management system.

Overview
The development of a national catastrophe risk management system for the Government of Turkey used innovative applications of state-of-the-art risk modelling methodologies to produce a comprehensive risk analysis and management system at a national level, specific to the Turkish environment. Data relating to the national building stock and new loss vulnerability functions were developed specifically for the project, as was a completely new probabilistic earthquake model, which uses the spectral displacement technique rather than the Modified Mercalli Intensity (MMI) scale to produce accurate estimates of damage from each modelled event. A bespoke catastrophe risk management system, called TCMS, was also developed, providing intranet-distributed, browser-based modelling and reporting functions in Turkish. The system has been successfully installed in the Turkish government offices.

Turkey is situated in one of the most seismically active regions of the world with a large part of the country at significant risk of major catastrophe. Historically, the Government of Turkey has had a legal liability to fund the costs of reconstructing buildings after an earthquake. The government's exposure to catastrophe risk has significant adverse implications for its budget, financing and its anti-inflationary targets. Thus in 1999, the government launched an ambitious project to tackle this national catastrophe risk by firstly privatising the risk through offering insurance via the Turkish Catastrophe Insurance Pool (TCIP) and then exporting large parts of the risk to the world's reinsurance markets.

Funded by the World Bank, this programme became part of a larger initiative known as the Turkish Emergency Flood and Earthquake Recovery Programme (TEFER). Success relied on the ability of the Turkish government to accurately quantify and manage the catastrophe risk. To this end, a state-of-the-art catastrophe risk modelling system was constructed with the following objectives:

  • to provide a comprehensive earthquake hazard model based on the most up-to-date scientific research and methodologies, ensuring that all necessary data and hazard model components were applicable to the Turkish environment and accepted within the scientific community;
  • to provide the results of the modelling process at a consistent level across the whole of Turkey;
  • to deliver a full suite of catastrophe risk management tools, distributed to key personnel within the Government of Turkey and their appointed representatives and tailored to the specific needs of the TEFER programme, therefore maximising the project's benefits to the government. The system needed to be robust, distributed to all authorised users across the government's intranet regardless of location, flexible in application and built to allow non-specialist users full functional access without the need for extensive training and on-site maintenance.

Achievement of these objectives and delivery within the limited project timeframe required an innovative and holistic approach to the loss modelling methodology and data collection processes, as well as to the construction of the risk management system. A key consideration throughout the project was to ensure that end-user requirements were incorporated at all stages of the development process.

    The risk modelling system took account of all new methodologies and research where appropriate, including paradigm shifts in earthquake research such as the use of spectral displacement rather than the outmoded MMI to determine damage potential. Significant resources were used in the creation of new data and functions specific to the Turkish situation to enable successful application of the new techniques at a national scale. At all stages, validity of the approach taken was ensured through rigorous academic scrutiny.

Delivery of the system took advantage of browser-based software development tools and procedures to make certain that user requirements were met, including the development of new map- and graph-based reporting functions, and to ensure that the results of the model were communicated to non-specialist users in their own language.

    To the best of our knowledge, TCMS is the first successful application of the latest earthquake modelling techniques at a national level and the first to apply them to a non-US environment. The rest of this paper will outline the main areas of innovation which led to the successful development and delivery of the TCMS system to the Turkish Government.

    State-of-the-art earthquake loss modelling for TCMS
    The creation of the loss modelling system for TCMS required an innovative approach to many of the core technical areas of the modelling process. New methodologies based on recent advances in US earthquake hazard modelling and research provide a nationally applicable standardised earthquake loss estimation methodology (FEMA, 2000). These were adopted where they were proven to be applicable to the Turkish context and where they would provide a real technical advantage to the Government of Turkey's risk management strategy. Whilst previously developed earthquake loss estimation techniques have often had the primary aim of estimating the cost of rebuilding after damage, the new methodology estimates the actual physical damage to buildings and is therefore potentially more accurate in estimating losses.

    The following sections deal with the main methodological concepts:

  • loss modelling, including the development of a new seismic catalogue which provided the optimal number of events for a national probabilistic earthquake model, and the provision of user-defined deterministic earthquake event modelling;
  • hazard data, including the development of new seismic source zones and the use of rupture plane projections to model event characteristics in three dimensions rather than as simplistic point or line ruptures;
  • vulnerability functions, including the creation of new vulnerability assessments specific to the Turkish environment and the development of new, calibrated vulnerability functions based on the US approach for 15 Turkish construction types; and
  • portfolio data, including the creation of a detailed national Turkish building stock database which was used in conjunction with the new vulnerability curves and spectral displacement intensity values.

    Loss modelling
A synthetic earthquake catalogue of credible earthquake events was created for the loss estimation model in order to determine national economic losses and potential losses to TCIP, and to estimate the annual losses for any given location in Turkey. An event-based procedure was required in order to obtain the shape of the loss exceedance curve for designing the insurance scheme. A loss exceedance curve shows, for a given monetary amount, the annual probability that at least one earthquake event will cause a loss of at least that amount. Individual earthquakes and their effects on buildings were modelled. This method required the generation of a large number of earthquake scenarios, each with an annual probability of occurrence.
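As a concrete illustration of the event-based procedure, the following minimal Python sketch derives exceedance probabilities and an annual average loss from a toy catalogue. The losses and annual rates are invented, and the independent-Poisson-occurrence assumption is ours for the purposes of the sketch, not a stated feature of the TEFER model.

```python
# Minimal sketch: a loss exceedance curve from an event-based catalogue.
# Event losses and annual occurrence rates are illustrative, not TEFER data.
import math

# Each synthetic event: (loss in monetary units, annual rate of occurrence)
catalogue = [
    (5.0e9, 0.002),   # rare, severe event
    (1.0e9, 0.010),
    (2.0e8, 0.050),
    (5.0e7, 0.200),   # frequent, moderate event
]

def exceedance_probability(threshold, events):
    """Annual probability that at least one event causes a loss >= threshold.
    Assuming independent Poisson occurrence, P = 1 - exp(-sum of the rates
    of all events whose loss meets the threshold)."""
    rate = sum(r for loss, r in events if loss >= threshold)
    return 1.0 - math.exp(-rate)

# Annual average loss: rate-weighted sum of event losses
aal = sum(loss * rate for loss, rate in catalogue)
print(f"Annual average loss = {aal:.3e}")

for threshold in (1.0e7, 1.0e8, 1.0e9, 1.0e10):
    p = exceedance_probability(threshold, catalogue)
    print(f"P(loss >= {threshold:.0e}) per year = {p:.4f}")
```

Sweeping the threshold over a fine grid of monetary amounts traces out the full exceedance curve used to design the insurance scheme.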

The epicentre locations for the catalogue events are shown in figure 1. An optimal number of events was included to ensure accurate results and to maximise model processing efficiency. The number of scenarios in the catalogue was determined by various factors, including the size and location of the seismic source zones, the magnitude range and the spatial distribution of populated areas, as well as the need to optimise computational time so that a national-level loss estimation model could be processed within a reasonable timescale. In order to ensure that high-population areas were captured by the areas of strong shaking due to the events, some (large) zones were split into several sub-zones.

Re-runs of historical events or hypothetical events are also possible. TCMS allows users to carry out probabilistic loss assessment using the synthetic catalogue, or to create their own scenarios and estimate the losses arising from them using the specially-developed, web-enabled earthquake event generator. This allows the user to define an event through a simple map-based interface, set the relevant event parameters, including magnitude and rupture orientation, and then save the resulting spectral displacement values as intensities for later loss estimation modelling.

    Hazard data – use of response spectra
A database of earthquake events that represents the temporal and spatial distribution of seismic activity in the region is the most important and fundamental input into an earthquake hazard model. Together with (neo)tectonic data and other scientific findings, it is used to develop seismic source zones which allow the generation of stochastic earthquake events, each with an annual probability of occurrence. New seismic source zones were developed for TEFER and, in modelling ground shaking, a projected fault rupture plane was used to model each earthquake event source. This is rather than, for example, a point or line source, which could overestimate the distance between the source and the site of interest and, as a consequence, underestimate the ground motion values and hence the damage. For each event of given magnitude, slip type and dip angle, the surface fault rupture length and width are calculated to generate a planar source projected onto the ‘surface' as an areal zone rather than as a point or line. Hazard intensities are calculated at the site of interest (for instance, a specific administrative area or other location) and modified to reflect site conditions and structural types where data are available.
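The following sketch illustrates why a projected rupture plane matters: for a large event, the epicentral distance to a site can substantially exceed the distance to the nearest point on the rupture, so a point source understates the shaking. The scaling coefficients are illustrative placeholders in the style of published empirical magnitude-rupture-dimension relations, and the flat x-y geometry is a simplification; neither reflects the actual TEFER implementation.

```python
# Sketch: point-source versus surface-projected rupture-plane distance.
# Coefficients and geometry are illustrative only, not the TEFER values.
import math

def rupture_dimensions_km(magnitude):
    """Rupture length and width (km) from magnitude via log-linear scaling,
    in the style of published empirical relations (placeholder coefficients)."""
    length = 10 ** (-3.22 + 0.69 * magnitude)
    width = 10 ** (-1.01 + 0.32 * magnitude)
    return length, width

def point_source_distance(site, epicentre):
    """Distance (km) treating the event as a point at the epicentre."""
    return math.dist(site, epicentre)

def plane_source_distance(site, epicentre, strike_deg, length):
    """Distance (km) to the nearest point on a surface-projected rupture
    trace of the given length, centred on the epicentre (flat x-y geometry)."""
    ux = math.cos(math.radians(strike_deg))   # unit vector along strike
    uy = math.sin(math.radians(strike_deg))
    dx, dy = site[0] - epicentre[0], site[1] - epicentre[1]
    # Project the site onto the trace and clamp to the rupture extent
    t = max(-length / 2, min(length / 2, dx * ux + dy * uy))
    nearest = (epicentre[0] + t * ux, epicentre[1] + t * uy)
    return math.dist(site, nearest)

site, epi = (60.0, 10.0), (0.0, 0.0)
L, W = rupture_dimensions_km(7.5)
print(f"Rupture: {L:.0f} km x {W:.0f} km")
print(f"Point-source distance: {point_source_distance(site, epi):.1f} km")
print(f"Plane-source distance: {plane_source_distance(site, epi, 0.0, L):.1f} km")
```

For this hypothetical M7.5 event the plane-source distance is roughly a third of the epicentral distance, which is exactly the effect that drives higher, more realistic ground motion estimates near long ruptures.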

The new methodology of displacement spectra was used for TCMS earthquake hazard intensity modelling. TEFER is, to the best of our knowledge, the first successful example of this methodology being applied outside the US. Although the use of response spectra as hazard intensity requires building-specific vulnerability functions and portfolio data, and has the potential to increase calculation time, it improves the accuracy of damage estimation for different building types and of the distribution of estimated losses. The relevant data collected to enable the use of displacement spectra are described in the following sections.

    One of the key factors used to distinguish different models assessing losses from earthquake events is the parameter employed to translate ground shaking into building damage. The majority of damage caused by earthquakes, particularly to buildings, can be directly attributed to the effects of ground shaking induced by the passage of seismic waves. The estimation of the ground shaking expected at each location is therefore fundamental to calculation of the resulting losses. The use of macroseismic intensity scales or indices such as MMI has been favoured until recently. Intensity is a natural choice for loss modelling application because it is directly related, by definition, to levels of damage in different categories of building. There are also extensive databases of building damage and corresponding intensities which permit the derivation of empirical damage functions. There are, however, many shortcomings in the use of intensity in models to estimate the losses due to future earthquakes. For example, loss modelling on the basis of intensity requires the use of attenuation equations that predict arithmetical values of intensity as if it were a continuous variable, whereas intensity values are discrete indices with non-uniform intervals. Furthermore, the use of intensity ignores entirely the relationship between the frequency content of the ground motion and the dominant period of buildings. It is also difficult to apply to modified or new building types.

The use of quantitative ground motion parameters such as peak ground acceleration has become more favoured in recent years. The level of damage is related both to the total energy in the ground motion and to the rate at which this energy is imparted to structures. Peak ground velocity is also used in some models; although it is related to the energy in the motion, it ignores the importance of the frequency content of the motion. The frequency component can only be accounted for by the use of response spectral ordinates, and hence the current movement is towards the use of response spectra and building capacity curves to estimate the behaviour of the building itself. The use of displacement spectra in particular has gained increasing recognition in determining damage levels due to earthquake shaking, and displacement spectra were therefore used for TEFER.
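For readers unfamiliar with the spectral ordinates involved, the standard single-degree-of-freedom relationship between pseudo-spectral acceleration and spectral displacement, Sd = (T/2π)²·Sa, can be sketched as follows; the period and acceleration values are illustrative only, not TEFER data.

```python
# Sketch: converting a pseudo-acceleration spectral ordinate to spectral
# displacement for a single-degree-of-freedom oscillator, Sd = (T/2π)² · Sa.
import math

def spectral_displacement_m(sa_g, period_s):
    """Spectral displacement (m) from spectral acceleration (g) at a period (s)."""
    sa_ms2 = sa_g * 9.81                          # convert g to m/s²
    return (period_s / (2.0 * math.pi)) ** 2 * sa_ms2

# A mid-rise building with a ~1 s fundamental period under Sa = 0.5 g
print(f"Sd = {spectral_displacement_m(0.5, 1.0) * 100:.1f} cm")
```

Because the conversion depends on the building's period, the same ground motion yields different spectral displacements for different building types, which is precisely why this intensity measure requires building-specific vulnerability functions.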

Vulnerability functions
Vulnerability functions relate hazard level to damage. Observed vulnerability and calculated vulnerability are the two general vulnerability estimation approaches used in hazard modelling. Observed vulnerability is based on past damage to the actual building stock in an area at a given intensity, usually expressed on a macroseismic scale. Actual claims data are often used to develop this function and to account for the uniqueness of individual portfolios. The approach is simple in concept and application and is favoured by insurance underwriters and some loss modellers. It provides no real modelling of the interaction between ground motion, site condition (i.e. soil) and structural response, however, and it does not fit with current engineering parameters of ground motion. Intensity measurement is difficult when the building stock is dynamic, and the approach is also difficult to apply to new or modified building types, all of which were important considerations in the Turkish situation.

The second approach, calculated vulnerability, is based on the calculated performance of different building types and is used by most earthquake engineers and some insurance loss modellers. It relates to engineering ground motion and avoids the use of macroseismic intensity. This approach allows the modelling of building types not previously damaged or with no damage records, such as retrofitted buildings, and the influence of ground conditions and the interaction between ground motion and structural response can be incorporated. In the new methodology, damage is determined by the spectral displacement of buildings in response to an earthquake, and the damage distribution is based on fragility curves which incorporate the variation of seismic performance within building types, as sketched below. This approach is, however, difficult to apply to complex structures, and many assumptions have to be made to allow for uncertainties. It is also invalid for buildings which fail in non-structural ways, and it is not based on damage data.
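A minimal sketch of the fragility-curve step follows, using the HAZUS-style formulation in which the probability of reaching or exceeding a damage state is lognormal in spectral displacement (FEMA, 2000). The medians and dispersions below are invented placeholders for a single hypothetical building type, not the calibrated Turkish functions.

```python
# Sketch: lognormal fragility curves, P(>= damage state | Sd) = Φ(ln(Sd/median)/β).
# Medians (cm) and betas are illustrative placeholders for one building type.
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Damage states: (median Sd in cm at which the state is reached, lognormal beta)
fragility = {
    "slight":    (1.5, 0.7),
    "moderate":  (3.0, 0.7),
    "extensive": (8.0, 0.8),
    "complete":  (20.0, 0.9),
}

def damage_state_probabilities(sd_cm):
    """Exceedance probability of each damage state at spectral displacement sd_cm."""
    return {state: normal_cdf(math.log(sd_cm / median) / beta)
            for state, (median, beta) in fragility.items()}

for state, p in damage_state_probabilities(5.0).items():
    print(f"P(>= {state}) = {p:.2f}")
```

Combining these exceedance probabilities with repair-cost ratios per damage state, and with the building counts in each location, yields the modelled loss for an event.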

To overcome the limitations of the second approach, a new, third approach based on damage-calibrated calculated vulnerability was introduced for TEFER. The building-specific vulnerability functions for Turkey were developed by Cambridge Architectural Research Ltd and incorporated 20 years' experience of Turkish earthquake vulnerability assessment. The basis of this approach is the same as for calculated vulnerability and takes advantage of the spectral displacement technique, but it avoids key limitations of the generalised approach by calibrating key parameters to match the loss distribution from observed vulnerability and recent local damage data.

    Portfolio data
This dataset consists of the aggregated insured value and number of policies in each location for a given line of business; additional information such as structural and occupancy types can be added to allow more detailed loss assessments. The data are provided by the model user, and the system developed for TCMS allows the portfolio data to be visualised once they have been loaded and validated in the loss model. This is a very powerful tool, allowing users to identify the distribution of their risk on-line; a sketch of the kind of record involved is given below.
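The field names and values in the following sketch are hypothetical, chosen to illustrate the aggregated structure described above; they are not the actual TCMS schema.

```python
# Sketch of an aggregated portfolio record of the kind the loss model consumes.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PortfolioRecord:
    geocode: str                         # administrative-area identifier
    line_of_business: str                # e.g. residential TCIP policies
    policy_count: int                    # number of policies at this location
    insured_value: float                 # aggregated insured value
    building_type: Optional[str] = None  # enables spectral-displacement vulnerability
    occupancy: Optional[str] = None      # refines the loss assessment

record = PortfolioRecord("TR-34-001", "residential", 1250, 6.2e8,
                         building_type="RC_frame_mid_rise")
```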

The use of response spectra required the portfolio data to contain building type information, and it is often difficult to capture all the necessary information for the purpose of portfolio loss estimation. Turkish buildings were classified into a set of building types according to their known seismic performance. As part of TEFER, a national residential building stock database was compiled to forecast the potential distribution of policies and to model losses to TCIP in order to identify high-risk areas. Non-residential building stock data were also compiled to allow economic loss estimation. The most up-to-date census data and annual construction statistics from the State Institute of Statistics in Ankara, together with more detailed data from local authorities, were used to estimate the number of buildings and the distribution of different building types. Total reconstruction costs were derived from recent construction costs and the average floor area and number of storeys for each building type. The national building stock data are very valuable: they can assist greatly in assigning building types to portfolio data where such information is not available and in assessing the potential coverage of the TCIP.

    The development of the earthquake loss modelling methodology and associated datasets was an important and resource-intensive process, and successfully delivered a state-of-the-art risk assessment to the Government of Turkey. The results of the risk modelling exercise also formed an integral component of the economic and risk transfer elements of the completed TEFER project and the whole process required considerable project management expertise and management of diverse academic and technical experts, both in Turkey and the UK.

    Additionally, in order to allow the results of the research to be applied in the management of Turkish catastrophe risk, it was essential that a sophisticated yet intuitive catastrophe risk management system was developed to allow the maximum benefit to be extracted by the government officials from the whole TEFER programme. The main objectives of the catastrophe risk management system were to provide the users with a browser-based distributed risk management system operating across the government intranet which enabled core loss estimation and risk analysis functions to be carried out by non-specialist but authorised users in the Turkish language. The core functions included:

  • the ability to load portfolios based on new building stock surveys or TCIP penetration;
  • the calculation of loss estimates, annual average losses and loss exceedance curves for the probabilistic earthquake model created using the new methodology; and
  • the generation of user-defined earthquake scenarios for analysis as deterministic events.

    It was also important that the system allowed the user to export the results of the modelling process for import into other non-TCMS software such as the risk transfer system and that the results could be reported in a format which was clear to non-specialists.

    In order to achieve these objectives, the development process followed new methods in creating web-enabled technologies. Innovations such as the development of the bespoke mapping and graphing interfaces as well as the supply of a system in the Turkish language were essential to its success. Traditional approaches to catastrophe software development were discounted due to the specific requirements for the TCMS system.

    TCMS System development
    Traditional approach – ‘stand alone' and ‘fat client' catastrophe risk systems
    In the past, catastrophe modelling software has been designed and implemented as either ‘stand alone', where all software, user interfaces and data are stored on the computer running the application, or ‘fat client' systems which implement a ‘client-server' architecture where software is loaded on individual PCs but data are held on a centralised database.

Such systems are created and tailored for a particular type of computer operating system and are generally distributed with a licensing agreement stipulating the number of machines on which the software may be installed. This type of software, while an established method for delivering catastrophe modelling technology, can present the user with a number of technical, resource and logistical problems.

For use within organisations, these software packages must be integrated within existing business systems and must also be compatible with current information technology (IT) and system strategies. This is a time- and resource-consuming process which may cause delays and potentially limit the use of the software across the user group, with the following potential issues:

  • difficult system administration. IT departments must install, maintain and upgrade application software on every single client machine. This is particularly time-consuming for software that is constantly upgraded;
  • as all of the application software is held on the client machines, there is a security risk from users who are able to ‘hack' in to the code;
  • all applications may be tied to a particular operating system or version of an operating system. This means that applications must be developed and implemented specifically for each type of client operating system (e.g. Windows 95, Windows NT, Apple Mac, etc.); and
  • each application maintains a direct database connection and as databases often limit the number of concurrent users, this can therefore restrict the number of users that can run their application at the same time. Network connections may also affect system performance.

    With these potential problems and difficulties affecting the implementation and usage of the systems at the user level, a different approach to model delivery becomes more advantageous.

    ‘Thin client' web-based catastrophe modelling
Web-enabled technologies have advanced sufficiently in sophistication and reliability over the last two years to allow large data servers and associated systems to be made available for authorised client access at acceptable speeds. The main advantage to users of such systems is their ‘thin client' nature. This allows overheads in system maintenance and PC specification to be minimised, enabling resources to be targeted at key hardware and sites. Intranets (applications operating within a secure network environment) and extranets are now established environments for providing distributed applications that enable users to access databases and interrogate their data, and TCMS utilises this approach to software delivery.

    TCMS and ‘three-tier' architecture
The TCMS system adopts a ‘three-tier' architecture (figure 2), comprising a ‘client-tier', an ‘application-tier' and a ‘database-tier'. Database and software producers such as Oracle have implemented this approach to developing distributed database applications and provide tools for developing each tier of the system. TCMS uses a relational database to deliver data to the client PC from the loss estimation database application. The following sections describe the basic principles behind the loss estimation system.

    Client-tier
The framework for providing working applications to the client machine is the web browser, devised as a means of viewing files located on other servers via the World Wide Web. Browsers are not specific to the internet and can therefore be used as tools for viewing files over other networks such as corporate intranets or extranets. Typical web browsers such as Netscape Navigator and Microsoft Internet Explorer allow the user to download a variety of files from the application tier (referred to as the ‘middle-tier' in figure 2). The TEFER system allows browsers to download graphical user interface (GUI) components: presentation logic from the middle-tier in the form of HTML pages containing small, embedded programs called Java Applets. These applets sit within the HTML pages and are executed by the browser's Java Virtual Machine at the time of download. The Java GUI (called a ‘form') comprises buttons, lists, tables and various other objects which provide a visual representation of the database data. Figure 3 shows one of the forms used in the TEFER application.

    Application-tier
The middle-tier/application-tier comprises forms servers, reports servers and a web server that interact to deliver the presentation logic to the client-tier and also to execute application components (business logic) on the application-tier. The business logic handles and processes requests for database data from the client-tier. With the application residing on this tier, the client PC need do no more than render the GUI components within the browser, as all calculations and processes are performed here. While this kind of application provision is fairly common in the IT industry, it is a new approach in the field of catastrophe modelling. In addition to its benefits over the ‘fat client' and ‘stand alone' systems highlighted earlier, the system is very flexible and applications can be scaled according to system usage. It is common for web applications to incorporate ‘web farms' when user concurrency becomes high, so that requests for web pages, forms and reports can be shared among a few or many servers, as in figure 4.

    Database-tier
This tier contains the Relational Database Management System (RDBMS) that holds all of the tables and database objects required by the TEFER system. An RDBMS comprises not only tables of relational data but also many objects and processes that can enhance the functionality and performance of the database. Suitable databases must be able to handle problems of concurrency, where many users attempt to select or change data in the same table; an RDBMS handles this situation with a range of database features. The database contains various monitors that interact to maintain data integrity. If rows in a table are being utilised by one user, the monitors are able to manage this process while checking that other concurrent requests do not attempt to access or alter the data at the same time. Such databases are also able to define buffers that store data updated in recent transactions; these buffers allow the database to undo (roll back) alterations to the data. These features may be considered essential in order to operate a central data source where many users may be running applications at the same time (e.g. during a renewal period).
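The transactional behaviour described above can be sketched as follows, using SQLite as a lightweight stand-in for the production RDBMS (which the paper does not name, citing Oracle only as an example vendor): a failed insert is rolled back so that readers never see a half-completed scenario.

```python
# Sketch: transactional insert with rollback, SQLite standing in for the RDBMS.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scenario_intensity (scenario_id INT, geocode TEXT, sd_cm REAL)")

rows = [(1, "TR-34-001", 5.2), (1, "TR-34-002", 3.8), (1, "TR-34-003", None)]  # last row invalid
try:
    with conn:  # opens a transaction; commits on success, rolls back on any exception
        for scenario_id, geocode, sd in rows:
            if sd is None:
                raise ValueError(f"missing intensity for {geocode}")
            conn.execute("INSERT INTO scenario_intensity VALUES (?, ?, ?)",
                         (scenario_id, geocode, sd))
except ValueError as err:
    print(f"Insert rejected, transaction rolled back: {err}")

# Nothing partial was committed: the table is still empty
print(conn.execute("SELECT COUNT(*) FROM scenario_intensity").fetchone()[0])  # prints 0
```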

    TCMS
The TCMS three-tier architecture allows other web applications to be developed and integrated into the same website as the database applications. The mapping and graphing interface for TCMS was developed using Java technologies for the display of data in the RDBMS. Web mapping has previously been very restrictive in functionality because reasonable amounts of RAM and fast processors are required to download, compile and run applications within browsers. To solve this problem, mapping has typically been performed on the server, with the area of interest posted to the client PC in the form of an image; the functionality on the client end is then simply pan and zoom, which removes any map interaction. The lowering cost and rapidly improving specification of client PCs mean that mapping can now become interactive on the client PC, and the TCMS solution required a unique innovation to provide this.

The TCMS software is designed using a library of pure Java classes that provide the programmer with the functionality to develop customised web-mapping software. This type of interface provides a spatial perspective on the database data. A suite of functions has been developed for use with TCMS:

  • display of client portfolio data;
  • display of hazard event scenario data;
  • display of modelled result data; and
  • interactive earthquake generator and database data upload function.

    The above modules allow TCMS to deliver value-added data display and the ability to visually create new hazard scenarios for use within the database. The page layout was designed to provide as much information as possible on one screen, and to ensure that all relevant data were printed onto one sheet of A4 paper. Selection of relevant database parameters is achieved via a series of list box and radio button controls, linking to the database. Figure 5 provides an example of the TCMS portfolio mapping system.

    TCMS earthquake generator
The earthquake generator function, as shown in figure 6, provides an innovative method of inserting data into the database from the client-tier. The generator implements the earthquake modelling approach already described. The software allows the user to select a point on the map to represent an earthquake epicentre and then, for an earthquake of a defined magnitude at that point, generates intensity values for each affected geocode, as in the sketch below.
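The generator's core loop can be sketched as follows: for a user-chosen epicentre and magnitude, an intensity value is computed for every geocode. The attenuation function and geocode centroids here are deliberately simple placeholders, not the TEFER relations or data.

```python
# Sketch of the earthquake generator's core loop; all values are placeholders.
import math

geocodes = {  # hypothetical geocode centroids (x, y in km)
    "TR-34-001": (5.0, 2.0),
    "TR-06-001": (40.0, 15.0),
    "TR-35-001": (120.0, -30.0),
}

def placeholder_attenuation(magnitude, distance_km):
    """Toy attenuation: intensity decays with log-distance; illustrative only."""
    return max(0.0, 1.5 * magnitude - 3.0 * math.log10(distance_km + 10.0))

def generate_scenario(epicentre, magnitude):
    """Return geocode -> intensity for a user-defined event, ready to be
    saved to the database as a new deterministic scenario."""
    return {g: placeholder_attenuation(magnitude, math.dist(c, epicentre))
            for g, c in geocodes.items()}

print(generate_scenario((0.0, 0.0), 7.2))
```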

The software visually displays the results and then asks the user whether the resulting values should be inserted into the database as a new scenario. The rigorous transactional nature of the database ensures that, once an SQL insert statement has been received, the tables involved are effectively locked against access by other users until the process has completed. Data inserts are thus managed sequentially, with each insert process queued, thereby preserving data integrity.

    Client-sensitive delivery
    In order to provide globally distributed systems, there will naturally be requirements to deliver applications in a variety of languages and to integrate with existing client organisation systems. TCMS can be integrated with existing systems such as intranets because the system can be accessed from a simple HTML hotlink within the client system.

Support for different languages introduces different characters and character sets that must be incorporated into the application on the client machine. Two-tier architectures cannot separate presentation logic from business logic (i.e. GUI software components from the data processing and calculation components) because all of these components are wrapped up in the same software code. TCMS avoids this because its presentation logic can be defined so as to instruct the GUI components (i.e. Oracle forms or map and graph applets) to render themselves according to the local language settings and characters (the locale) of the client machine.

    If the presentation logic is treated as a separate entity from the business logic, applications can respond to changes in languages and characters while using the same data calculations and functions. Java Applets are able to detect the location of the client machine from the Java Virtual Machine (JVM) present in the web browser. If the locale settings of the client machine can be determined, the software can respond and deliver the GUI components in the appropriate language (e.g. a ‘yes' button in Britain or a ‘oui' button in France). All environment settings are therefore controlled by the web browser, which in turn inherits its environment settings from the operating system, locale and associated character sets. This also means that systems can be delivered to a client machine irrespective of the client operating system. TCMS has successfully delivered its modelling software in the Turkish and English language environments using this software technology.
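The locale-driven rendering principle can also be sketched outside the Java applet/JVM mechanism the paper describes; the following Python fragment uses the client machine's locale to choose GUI strings, with an illustrative translation table.

```python
# Sketch: choosing GUI strings from the client locale; translations illustrative.
import locale

TRANSLATIONS = {
    "tr": {"yes": "Evet", "no": "Hayır", "run_model": "Modeli çalıştır"},
    "en": {"yes": "Yes", "no": "No", "run_model": "Run model"},
}

def label(key):
    """Render a GUI label in the language of the client machine's locale,
    falling back to English when no translation is available."""
    lang_code = (locale.getlocale()[0] or "en")[:2].lower()
    strings = TRANSLATIONS.get(lang_code, TRANSLATIONS["en"])
    return strings.get(key, key)

print(label("yes"))  # "Evet" on a Turkish-locale machine, "Yes" elsewhere
```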

    Conclusions
    The TEFER project has successfully delivered a state-of-the-art catastrophe risk management system to the Government of Turkey. This required the innovative application of new research and techniques to the problem of Turkish earthquake risk and is, as far as we are aware, the first successful use of the spectral displacement method at a national level and outside the US. The project required new data and methodologies to be produced at a consistent scale across the whole of Turkey to enable the advantages of the new earthquake estimation process to be realised.

The provision of an intranet-distributed, browser-based catastrophe risk modelling system (TCMS) has allowed the Government of Turkey to make best use of the research undertaken through the use of simple yet rigorous data processing, modelling and risk analysis tools. The development of mapping and graphing software using Java development tools, as well as the provision of the software in the Turkish language, ensures that the results of the modelling and risk analysis process are easily understood by non-specialist users. It is hoped that the use of the system may help to mitigate and manage the significant risk faced by the Turkish people from earthquakes in the future.

    References
    FEMA (2000). FEMA 366: HAZUS99 estimated earthquake losses for the United States. Federal Emergency Management Agency, Washington D.C.