Recent years have seen a number of notable catastrophes, including two Space Shuttle disasters, the Concorde failure, two Boeing 737 MAX failures, and most recently, the Titan submersible failure. The purpose of this essay is not to determine whether these accidents could have been prevented or what caused them, but only to ask whether the right questions were asked and what assumptions were made. These questions also matter for programs now in the works, including the Boom supersonic aircraft. Other projects with potentially high risks include the super-high skyscrapers described by Nova.
One reason we don’t take disasters seriously, according to Geiger, is that we forget about them. Geiger notes that the 1947 Texas City disaster, with nearly 600 fatalities, was among the deadliest industrial accidents in history, yet disasters have become so common that few people remember it.
The first question that needs to be asked is whether the system being developed, whether an aircraft or a skyscraper, included risks as part of its development plan. In the context of this essay, a system is any collection of parts with a singular purpose, most often to transport humans. The answer to this question can be found in the documents and websites describing the proposed system. The absence of the word risk in these documents and websites is a clue to the importance of risk in product development. To assume that a program has no risk violates Rule No. 1: All programs have risk. If a program truly has little or no risk, that claim must be demonstrated; a simple assertion will not do. According to Hillson, “…the zero-risk project is neither possible nor desirable.”
This violation of Rule No. 1 is often reinforced with the spoken comment, “We are good engineers; we don’t have risks.” The absurdity of this statement should be apparent by now. Its implicit claim is not that risks are inherently zero but rather that the designers’ expertise is so extraordinary that risks can be designed out of the project by simple effort, a claim Hillson’s observation directly refutes.
So, how are risks mitigated? For super-high risks, such as Space Shuttle launches, the Columbia Accident Investigation Report provides the most conservative approach: It suggests that all launches should be approved by an Independent Technical Authority (ITA). For this approach, independent implies both organizational and financial independence. This approach would be suitable for projects such as the Titan submersible.
One aspect of catastrophes rarely discussed is the effect on the victims. It is difficult, for example, to imagine what it would be like to be hurtling into the ocean while inside an aluminum tube. Likewise, having tons of debris coming down on you is beyond comprehension.
Finally, there is one issue on which experts differ: how should low risks and non-risks be treated? Conrow, for example, suggests that a list of low risks should be kept and treated later if they become large. Hillson, on the other hand, takes the more conservative position that all systems have risks even when none is anticipated. Even under Hillson’s assumption, the Conrow approach of maintaining a watch list would be appropriate.
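The Conrow-style watch list described above can be sketched in code. This is a minimal illustration, not a reproduction of Conrow’s method: the risk names, the 1–5 likelihood and consequence scales, and the escalation threshold are all assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """A watch-list entry. Scales are illustrative assumptions (1 = low, 5 = high)."""
    name: str
    likelihood: int   # 1 (rare) .. 5 (near certain)
    consequence: int  # 1 (negligible) .. 5 (catastrophic)

    @property
    def score(self) -> int:
        # A simple likelihood-times-consequence score, as in common risk matrices.
        return self.likelihood * self.consequence

def escalate(watch_list: list[Risk], threshold: int = 10) -> list[Risk]:
    """Return the watched risks whose score has grown large enough to need treatment."""
    return [r for r in watch_list if r.score >= threshold]

# Low risks are recorded rather than discarded, then periodically re-scored.
watch_list = [
    Risk("seal degradation", likelihood=2, consequence=3),  # score 6: keep watching
    Risk("hull fatigue", likelihood=3, consequence=5),      # score 15: escalate
]

for risk in escalate(watch_list):
    print(f"Escalate: {risk.name} (score {risk.score})")
```

The point of the sketch is that a watch list costs little to maintain, which is why it remains sensible even under Hillson’s assumption that unanticipated risks always exist.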
________________________________________________________________________________________________________________________
Scott Jackson is a research scientist with expertise in applied systems theory. He was formerly with the University of Southern California (USC) and the Missouri University of Science and Technology (MST). He is a Fellow of the International Council on Systems Engineering (INCOSE). He has published four books on systems engineering. He is currently a consultant for both Embraer of Brazil and Comac of China, offering advice on the use of systems theory in the design of commercial aircraft.