There is a well-known comparison, drawn by IT salesmen, that there is more computing power in one of the current range of palmtop computers than NASA used to put a man on the moon. Whatever the accuracy of such statements, there can be no dispute that the rate of technological development in the 30 years since then has transformed the space industry.

Insurers, who have been part of the market for commercial satellites since the early days, when the major risk was that of total loss during launch, have seen many changes. In those early days, the technical complexity of satellites had not evolved to today's levels and, generally, insurers enjoyed a period of profitability. However, the market was not mature, and the inevitable cyclical nature of the industry saw rates deteriorate in the mid-1980s. The consequent reduction in capacity coincided with an increase in insured values, a direct result of manned space shuttle flights carrying several spacecraft and the dual-launch capability of the Ariane vehicle.

Growth in the sources of supply of telecommunications has been very significant during the 1990s, with a consequent increase in the number of satellites launched, mostly attributable to the low orbit satellite constellations deployed to achieve operators' objectives. Experience during this period has begun to indicate that former perceptions of the risk to the integrity of satellites were not entirely correct. Contrary to earlier thinking, which saw the launch as the principal risk, there is now evidence of a significant element of risk after the satellite has reached its intended orbit.

The insurance market has seen its loss experience deteriorate over the 1990s. The performance of launch vehicles has been consistent across the decade but, since its mid-point, there has been a marked change in the performance of satellites, evidenced by large increases in in-orbit losses.

An analysis of the causes of satellite failure has been carried out recently by Marcello Tarabochia of the space underwriting team at Generali Global in London. According to his study, “There has been a lack of appreciation of the ‘in orbit' part of the risk during the lifetime of a satellite. Clearly there are lessons to be learned by both insurers and manufacturers based on this analysis. Insurers will have to pay more attention to the risks associated with orbit, whilst manufacturers will need to examine quality control procedures and identify where these are failing and giving rise to the greater incidence of losses on infant and juvenile orbits.”

Generali Global's analysts looked at commercial satellites, excluding national security and scientific launches, where the nature of the payload can influence the probability of failure. Of some 500 spacecraft launched, more than one third experienced a failure, and a staggering proportion of those failures, more than 50%, arose in the first month of orbital life. Not only is the frequency of orbital losses a surprise, but the analysis of causes also gives reason for concern.

Whilst much has been written about the increased risk arising from the growing number of satellites in earth orbit and the amount of “junk” from earlier decades, the analysis does not identify such risk as a prime contributor. Of greater concern is the dense population of operating spacecraft in crowded channels and orbital locations, which causes interference. Meteorites and the space debris resulting from solar storms and other extraterrestrial events have also attracted much comment, particularly in the popular media, but in the survey, losses suspected of arising from such causes account for an insignificant fraction of all those recorded. An overwhelming proportion is attributed to human error. In this context, the expression “human error” should not be equated with pilot error in aviation terms. There is evidence that complex technology and commercial pressure have conspired to create the situation. Pressure to bring new products to market cuts the time available for production and for thorough testing of individual components, and raises issues about the integrity of design. This combination of time pressures and budgetary constraints is becoming the major source of concern for underwriters.

As if to compound this situation, the technology is evolving so rapidly that observers are now seeing a knowledge gap between different generations of technicians.

Many experienced engineers are concerned that the universities and colleges are producing graduate engineers whose knowledge does not extend to the most recent technical developments. One would expect a knowledge gap of this nature to be addressed by employers through training, but the cost involved makes training programmes an unpalatable management option. Additionally, the demands that the rate of development imposes upon newly qualified technicians can limit the range of experience they gain before taking up positions of responsibility. It follows that, if this knowledge gap is to be overcome, manufacturers must find a way to replace the knowledge exchange that once occurred by osmosis, when technical advances came at a more manageable pace.

Massimo Orsini, underwriting manager, aviation and space, Generali Global, cites an example of hardware failure associated with battery cells. Investigation showed extensive corrosion of the cells, caused by too high a concentration of electrolyte used during their manufacture. Effective quality control should have identified the error long before the manufacturing process was completed. Another costly human error arose from a mismatch between the hard-wiring routes of a component established by one technician and the software instructions issued by another technician working on the same project.
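
To make the quality-control point concrete, the sketch below shows the kind of automated acceptance gate that would have caught the out-of-spec electrolyte. It is illustrative only: the specification limits, units and measured values are invented for this sketch and do not come from the investigation described above.

```python
# Hypothetical acceptance gate for the battery-cell example.
# All limits, units and measurements here are invented for illustration.

SPEC_LOW, SPEC_HIGH = 1.20, 1.28   # assumed acceptable electrolyte concentration

def out_of_spec_cells(measurements, low=SPEC_LOW, high=SPEC_HIGH):
    """Return the indices of cells whose concentration falls outside spec."""
    return [i for i, c in enumerate(measurements) if not low <= c <= high]

batch = [1.24, 1.26, 1.41, 1.25]   # cell 2 is over-concentrated
rejects = out_of_spec_cells(batch)
if rejects:
    # In a real process this would halt the batch before assembly,
    # long before corrosion could develop in orbit.
    print(f"Reject cells {rejects} before assembly")
```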

Events such as these raise the question of whether there is a fundamental flaw in some of the construction processes that permits such errors. Has the management of the manufacturing process itself become so complex that no single authority has overall responsibility for the total process?

Present-day manufacturing procedures rely heavily upon IT solutions. CAD/CAM software is an essential tool in hardware design and manufacture, and self-checking is widely used as a means of testing software. Increasing reliance upon these techniques requires designers and builders to ensure that the software used to check a design is not the same software used to create it. Failure to do so means that, while errors might be identified, omissions cannot be, and there have been losses where the cause is suspected to lie in such an omission.
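
The sketch below, with entirely hypothetical names and a deliberately trivial “design”, illustrates the logic of that warning: a checker that reuses the model which generated the design will confirm that everything present is correct, but can never notice what the shared model itself leaves out; only an independently maintained requirements list can.

```python
# Illustrative only: hypothetical names, not any real CAD/CAM tool.

# Shared model used both to generate and (wrongly) to verify the design.
# Note the omission: it never mentions a battery heater.
DESIGN_MODEL = {"power_bus", "transponder", "thermal_radiator"}

def generate_design(model):
    """Derive the parts list directly from the model."""
    return set(model)

def self_check(design, model):
    """Verify the design against the same model that produced it."""
    return design == set(model)          # omissions are invisible by construction

def independent_check(design, requirements):
    """Verify the design against a separately maintained requirements list."""
    return requirements - design          # anything missing is reported

design = generate_design(DESIGN_MODEL)
print(self_check(design, DESIGN_MODEL))                               # True
print(independent_check(design, DESIGN_MODEL | {"battery_heater"}))   # {'battery_heater'}
```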

Another example of the potential for increased loss payments identified by our research relates to increases in the number of working transponders carried by present-day communications satellites, at the expense of spares and basic spacecraft equipment. If there is an increase in the incidence of “human error” failures stemming from deficiencies in inspection or quality control, then events which should trigger only a partial loss become the precursors of constructive total losses.
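
A rough, purely hypothetical calculation shows why that matters. Assume a policy deems the satellite a constructive total loss (CTL) once usable capacity falls below half of design capacity, a threshold invented for this sketch, and compare an older design rich in spares with a newer one that trades spares for working transponders.

```python
# Invented numbers throughout: a sketch, not any actual policy or satellite.

CTL_THRESHOLD = 0.5   # assumed contractual constructive-total-loss threshold

def remaining_capacity(working, spares, failed):
    """Fraction of design capacity left after failures, spares used first."""
    unreplaced = max(0, failed - spares)
    return (working - unreplaced) / working

# The same proportional failure (about 58% of transponders) in both designs:
older = remaining_capacity(working=24, spares=8, failed=14)   # 0.75 of capacity
newer = remaining_capacity(working=48, spares=2, failed=28)   # about 0.46

print(older, older < CTL_THRESHOLD)   # 0.75 False -> partial loss at most
print(newer, newer < CTL_THRESHOLD)   # 0.458... True -> constructive total loss
```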

Other observers identify the prospect of difficulties arising in the context of software, and ascribe the cause to the same dichotomy of technical advance versus commercial pressure. “The speed of computers and their ability to deal with code has grown very, very fast, and our ability to produce software that is reliable and predictable has not kept pace with the growth of computers. We're launching programmes today with millions of lines of code, and our ability to manage those systems is quite weak,” says Lee Holcomb, chief information officer for NASA, quoted in Space News on 24 April 2000.

In the same report, Stephen Book, an expert on cost issues at the Aerospace Corporation of El Segundo, California, is reported as saying that companies often underestimate the number of lines of software code needed for a job, and the amount of time needed to write the code. Mr Book also noted that the drive to cut costs has once again led to the elimination of some testing, leading to costly software failures in some programmes. “Space is just hard,” he said. “You make an error and you have lost a whole system. It costs money to test everything to the limit. When you cut testing, the philosophy is that we can save money. ‘Let's cut out unnecessary testing.' Who can oppose that? But the problem is, who determines what is unnecessary?”

Exchanging information

Awareness of the causes of the failures helps insurers to understand the risk better. If that information can be shared, it should help manufacturers to discover the routes by which they can reduce or eliminate the incidence of such losses for the benefit of manufacturers, operators and owners, as well as insurers. But there is an obstacle.

Under the US government's International Traffic in Arms Regulations (ITAR), the exchange of technical data relating to satellites, including failure analysis reports, is severely restricted. These restrictions inhibit the free flow of data, many believe to the detriment of the whole space industry. It is not unreasonable to say that, but for these constraints, some recurring failures might have been avoided.

As a long-term and major participant in the space insurance market, Generali has shown its commitment to the exchange of information. Since 1979, we have organised and sponsored a biennial space conference. The next, the 11th, is scheduled to take place in Italy in spring 2001. The conference will be chaired by Benito Pagnanelli, a deputy general manager of Generali and CEO of Generali Global in London, who has seen the evolution of the space insurance market from its earliest beginnings.

“This is a very important area of research, providing explanations for some of the problems which result in partial losses and constructive total losses of satellites,” he says. “Everyone involved in this business has a responsibility to work towards better results, which will only come from a better understanding of the causes of loss. That is why I welcome the work that the Generali space team is doing in this field. I hope that this important topic will generate discussion at our next conference and that the publicity it attracts will assist in improving the situation.”

  • David Dunnett, head of public relations and communication at Generali Global, wrote this article in collaboration with the Generali Global space department.