Randomness and Engineering Failure

While it is not specifically a book about engineering disaster, or even failure, I find some interesting connections in the ideas presented in Leonard Mlodinow’s “The Drunkard’s Walk: How Randomness Rules Our Lives”. (Mlodinow is a co-author, with Stephen Hawking, of “A Briefer History of Time”, as well as a number of other very good books; see his Caltech website http://www.its.caltech.edu/~len/.)

Mlodinow talks about how we view events with what is referred to as “20-20 hindsight”. Given an extraordinary event like an engineering disaster, it is often fairly straightforward (after a thorough investigation) to identify what we believe are particular, logical causes. However, it is necessary to guard against jumping to conclusions about the judgement of those who may have (in some way) caused the failure. The more epic and emotional the situation, the more likely this rush to judgement is to occur. Disasters are, by definition, both epic and emotional.

When I teach about engineering disaster, I ask students (following their analysis of a disaster) to try to shift their perspective to that of a working system, before the failure. This is very difficult to do, but in many cases it may be the only way to understand the factors that affect (or cloud) the judgement of the engineers, operators, and others who play a key role in the eventual failure of the system. Mlodinow discusses this in terms of a concept referred to as “availability bias” in reconstructing the past. The concept is described succinctly on wisegeek.com as follows:
“Availability bias is a human cognitive bias that causes us to overestimate probabilities of events associated with memorable or vivid occurrences. Because memorable events are further magnified by coverage in the media, the bias is compounded on the society level. Two prominent examples would be estimations of the probability of plane accidents, and the abduction of children. Both events are quite rare, but the vast majority of the population wildly overestimates their probability, and behaves accordingly. They are falling prey to the availability bias…”
(http://www.wisegeek.com/what-is-availability-bias.htm).

Hence “availability bias” is a very important concept to consider when teaching (and learning) about engineering disaster.

The probability and uncertainty of events (the primary theme of Mlodinow’s book) are central to the engineering methodology of “Design for Reliability”. Design for Reliability (or DfR) is really decision making under uncertainty. I will write more about this concept in a future post (and add some links to it in the list of links on this blog page), but I think it would be impossible to teach DfR without first teaching the concepts and mathematics of probability. We, as engineers, never know for certain how a system might fail. We can only work with the likelihood of failure of the individual components of a system, the relative importance of each component, and the severity of the consequences of a particular component’s failure for the performance of the overall system. This is why DfR must use structured approaches (to reduce uncertainty) as well as the intuition of engineers. While intuition is extremely valuable, it must be applied within the context of structure (as I have stated in previous posts) to avoid being clouded by emotion, environment, and biases (including, of course, “availability bias”).
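To make “decision making under uncertainty” a little more concrete, here is a minimal sketch of the kind of calculation a structured DfR approach might involve. The component names, failure probabilities, and severity scores below are hypothetical, invented purely for illustration; the post does not prescribe any particular tool. The sketch estimates the failure probability of a simple series system by Monte Carlo simulation, checks it against the exact result, and then ranks components by a simple likelihood-times-severity risk score (in the spirit of FMEA-style prioritization):

```python
import random

# Hypothetical per-demand failure probabilities for three components in a
# series system (the system works only if every component works).
# These numbers are invented for illustration.
component_failure_prob = {
    "valve": 0.002,
    "sensor": 0.010,
    "controller": 0.001,
}

def simulate_system_failure_prob(n_trials: int) -> float:
    """Estimate P(system fails on a demand) by Monte Carlo sampling,
    assuming the components fail independently."""
    failures = 0
    for _ in range(n_trials):
        if any(random.random() < p for p in component_failure_prob.values()):
            failures += 1
    return failures / n_trials

# Analytical check for a series system of independent components:
# P(system fails) = 1 - product over components of (1 - p_i)
p_survive = 1.0
for p in component_failure_prob.values():
    p_survive *= (1.0 - p)
p_exact = 1.0 - p_survive

print(f"Monte Carlo estimate: {simulate_system_failure_prob(200_000):.4f}")
print(f"Exact (independent):  {p_exact:.4f}")

# Hypothetical severity scores (1 = negligible, 10 = catastrophic).
# Ranking components by likelihood x severity is one simple, structured
# way to weigh relative importance against severity of consequences.
severity = {"valve": 8, "sensor": 3, "controller": 9}
risk = {name: component_failure_prob[name] * severity[name] for name in severity}
for name, score in sorted(risk.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:<12} risk score = {score:.4f}")
```

A real DfR workflow would, of course, replace these guessed numbers with field data, test results, or expert elicitation, and would use an established framework (FMEA/FMECA, fault trees, reliability block diagrams) rather than an ad hoc script. The point here is only that structure turns “we never know for sure” into explicit, comparable quantities.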
