Aging infrastructure and the Long Island Rail Road

For those of you looking for teaching opportunities in recent engineering failures (and with the new academic year starting, this may be on your mind), a good place to look is reports of failures in aging infrastructure. Bridges, gas pipelines, electrical transmission systems: examples (unfortunately) abound, and they can often be found in your local area. In fact, I occasionally conduct an “un-nature walk” with my students, during which I point out not only how engineers design buildings, roads, street lights, and so on, but also how infrastructure fails. From cracked sidewalks to rust stains from rebar bleeding through walls, examples are everywhere.

An interesting example with teaching potential, in my neck of the woods, is the recent failure (due to a fire in a century-old switch room) of the Long Island Rail Road’s venerable Jamaica hub. The New York Times has covered this nicely (see http://www.nytimes.com/2010/08/24/nyregion/24lirr.html?_r=1&scp=1&sq=long%20island%20transit&st=cse for example), and there is plenty of additional information available. Of course, a major problem with replacing aging infrastructure is cost, which presents a difficulty for engineers and politicians alike. Where, and what, do you repair or replace first? How do you create an effective maintenance plan given constraints on time, labor, and money? (One way to frame that question for students is sketched below.) Hopefully, we won’t always have to wait for a failure (or a disaster in terms of lives or costs) to provide guidance.
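As a classroom exercise, the prioritization question can be made quantitative. Here is a minimal Python sketch of my own; the project names, costs, and probabilities are entirely made up, and are not drawn from the LIRR or any agency. The idea: rank candidate repairs by expected loss avoided per dollar, then fund them greedily until the budget runs out. Real capital planning is far more involved, but the sketch captures the basic trade-off.

```python
# A toy "what do we fix first?" planner. All numbers below are invented
# for illustration; nothing here reflects real agency data.
from dataclasses import dataclass

@dataclass
class Repair:
    name: str
    cost: float          # dollars to perform the repair
    failure_prob: float  # estimated chance of failure this year (0 to 1)
    consequence: float   # estimated cost of a failure, in dollars

    @property
    def risk_reduction_per_dollar(self) -> float:
        # Expected loss avoided per dollar spent, assuming the repair
        # eliminates the failure mode entirely (a strong simplification).
        return (self.failure_prob * self.consequence) / self.cost

def plan(repairs: list[Repair], budget: float) -> list[Repair]:
    """Greedily fund the highest risk-reduction-per-dollar repairs first."""
    funded = []
    for r in sorted(repairs, key=lambda r: r.risk_reduction_per_dollar,
                    reverse=True):
        if r.cost <= budget:
            funded.append(r)
            budget -= r.cost
    return funded

if __name__ == "__main__":
    candidates = [
        Repair("century-old switch room", 5_000_000, 0.10, 80_000_000),
        Repair("bridge deck resurfacing", 2_000_000, 0.05, 20_000_000),
        Repair("signal cable replacement", 1_000_000, 0.20, 10_000_000),
    ]
    for r in plan(candidates, budget=6_000_000):
        print(f"Fund: {r.name} (${r.cost:,.0f})")
```

With these made-up numbers, the greedy plan funds the signal cables and the switch room and defers the bridge deck. Students can then debate what the ranking misses: interdependencies between projects, lumpy costs, and the problem of weighing lives against dollars.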

Good luck in the new academic year, and please write back with your thoughts on learning from disaster!

New York Times wins award for series on problems with medical radiation treatments

A few posts ago, I wrote about a recent New York Times article on accidental overdoses of radiation received by some patients undergoing an intensive form of CT scan used to detect strokes. That article is part of an investigative series in the Times, which has now been recognized with a Public Service Award from the Associated Press Managing Editors association. The full article about the award can be found at http://www.google.com/hostednews/ap/article/ALeqM5jTEYAaTLag_kgedHrdnxv0QHbSAgD9HQLMIO0.

I had read some of the previous articles in this series as well. Apparently, it is a challenge to design medical treatment equipment that is easy to use and flexible enough to allow variability in treatment (a good thing) while preventing accidental (or even intentional) misuse. Engineers have to pay attention to these critical needs, which fall within an area of engineering design called ‘ergonomics’.

Ergonomics is sometimes referred to as ‘human factors’ or, in other words, design for humans. Knowing how big to make a car seat, how far to put controls from a pilot, and so forth, is only part of it. Ergonomics also includes issues of aesthetics (the appearance of a product) and safety. Safety encompasses not just designing so that the product has no hazardous components, sharp edges, and the like, but also so that it comes with clear instructions and proper safety warnings and labeling. This is a critical part of design: without proper attention to ergonomic and safety needs, misuse and errors can lead to disaster.

A new blog on disasters …

James Chiles, the author of “Inviting Disaster: Lessons from the Edge of Technology” (listed on our bibliography page), has developed a blog with his take on some recent engineering disasters. Please have a look! It is at: http://disaster-wise.blogspot.com/

Design lessons from medical treatment disaster

A recent article in the New York Times (“The Mark of an Overdose”, by Walt Bogdanich, 8/1/10, http://www.nytimes.com/2010/08/01/health/01radiation.html?pagewanted=all) discusses dangers to patients from CT brain scans. In some cases (and there are shocking pictures of the damage in the article), patients who had CT scans to test for a stroke received an overdose (up to 13 times the usual amount for the test), which resulted in striking hair loss as well as headaches and memory-related symptoms. As stated in the article, the Times review found that “While in some cases technicians did not know how to properly administer the test, interviews with hospital officials and a review of public records raise new questions about the role of manufacturers, including how well they design their software and equipment and train those who use them.” The author found that when the test was applied in a way intended to provide better images of blood flow, technicians used an automatic feature of the equipment that they thought would lower radiation levels but in fact raised them. While this excellent article provides a much more in-depth analysis of the problem (and you should read it), I wish to focus on a particular aspect related to engineering design and what can be learned from it.

What is the responsibility of engineers and their companies in designing a system so as to avoid possibly disastrous user error? We can cite many examples where this becomes a critical issue: other medical cases, like the titular failure of the cancer treatment equipment in the book “Set Phasers on Stun” (listed on the bibliography page), and non-medical failures, such as the Bhopal/Union Carbide chemical plant disaster, in which negligence stemming from inadequately trained operators played a contributing role.

As engineers and designers of powerful equipment and systems, when we study these failures, the case studies emphasize the need for manufacturers to provide comprehensive and clear instructions, and to design “fail-safe” features and limiting functions into equipment to automatically prevent dangerous situations from occurring. Of course, this can never be accomplished completely. The light-hearted phrase goes, “build an idiot-proof system, and someone will build a better idiot.” The situation here is deadly serious, but the phrase speaks to a truth: it is the responsibility of the conscientious engineer to do their best to anticipate all uses, and misuses, of the machines, systems, and processes they design.
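To make the idea of a limiting function concrete, here is a minimal Python sketch of my own. The function names, units, and threshold are invented for illustration, and real medical device software (governed by standards such as IEC 62304) is vastly more rigorous than this. The point is simply that the software refuses any request outside a safe envelope, rather than trusting that an operator, or an “automatic” feature, computed it correctly.

```python
# A toy "limiting function" for a radiation delivery system. The limit
# and units are hypothetical; this is not modeled on any real scanner.

MAX_DOSE_MGY = 500.0  # hypothetical hard ceiling for one scan protocol

class DoseLimitError(Exception):
    """Raised instead of silently adjusting the dose, so the unsafe
    request is visible to the operator rather than hidden."""

def deliver_dose(requested_mgy: float) -> float:
    """Accept a dose request only if it falls inside the safe envelope."""
    if requested_mgy <= 0:
        raise DoseLimitError(
            f"Non-positive dose requested: {requested_mgy} mGy")
    if requested_mgy > MAX_DOSE_MGY:
        # Fail safe: block and alert, rather than assume the request
        # (perhaps produced by an automatic feature) is sensible.
        raise DoseLimitError(
            f"Requested {requested_mgy} mGy exceeds the "
            f"hard limit of {MAX_DOSE_MGY} mGy")
    # In a real device, independent hardware interlocks would also apply.
    return requested_mgy
```

Raising an error, rather than silently clamping the value, matters here: the operator is forced to notice and confirm intent, instead of the machine quietly doing something other than what was asked.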

The Gulf oil spill as a learning tool — information around the web

I have noticed some great information lately about how to use the Gulf oil spill in classroom and problem-based learning activities. A blog post by Eric Brunsell, “Project-Based Learning and the Gulf Oil Spill” (http://www.edutopia.org/blog/oil-spill-project-based-learning-resources), collects many of these resources nicely. There is also a great learning activity for grades 6-12 from the New York Times at http://learning.blogs.nytimes.com/2010/05/05/the-drill-on-the-spill-learning-about-the-gulf-oil-leak-in-the-lab/. I strongly recommend both if you are thinking about ways to use the Gulf oil spill in the classroom or as part of group assignments.

The Causes of Disaster

Most engineering failures can be attributed to a combination of causes: human (including ethical) failures, design flaws, materials failures, and extreme conditions. One can add pure accidents to the list as well, but further analysis often attributes those accidents, at least in part, to the causes above. In teaching with engineering disasters, I have found it valuable to provide background information and concepts about each area, so I plan to add a page on this blog for each one. That way, when I (or you) find an interesting link or resource that sheds light on one of these causal areas, it will have a place to go. I will also try to include resources of this nature on my website on learning from disaster, so those of you who are educators can make use of them.