Design lessons from a medical treatment disaster

A recent article in the New York Times (“The Mark of an Overdose”, by Walt Bogdanich, 8/1/10, http://www.nytimes.com/2010/08/01/health/01radiation.html?pagewanted=all) discusses dangers to patients from CT brain scans. In many cases (and there are shocking pictures of the damage in the article), patients who had CT scans to test for a stroke received an overdose (up to 13 times the usual amount for the test), which resulted in striking hair loss, as well as headaches and memory-related symptoms. As stated in the article, the review by the NYT found that “While in some cases technicians did not know how to properly administer the test, interviews with hospital officials and a review of public records raise new questions about the role of manufacturers, including how well they design their software and equipment and train those who use them.” The author found that when applying the test in a way intended to provide better images of blood flow, technicians used an automatic feature of the equipment that they believed would lower radiation levels but in fact raised them. While this excellent article provides a much more in-depth analysis of the problem (and you should read it), I wish to focus on a particular aspect related to engineering design and what can be learned from it.

What is the responsibility of an engineer and their company in designing a system so as to avoid possibly disastrous user error? We can cite many examples where this becomes a critical issue: other medical examples, like the titular failure of the cancer treatment equipment in the book “Set Phasers on Stun” (listed on the bibliography page), and non-medical failures, such as the user error that played a contributing role in the Bhopal/Union Carbide chemical plant disaster. In that case, negligence due to inadequately trained operators played a critical role.

As engineers and designers of powerful equipment and systems, we should study these failures. The case studies emphasize the need for manufacturers to provide comprehensive and clear instructions, and to design “fail-safe” features and limiting functions into equipment to automatically prevent dangerous situations from occurring. Of course, this can never be accomplished completely. The light-hearted phrase goes, “build an idiot-proof system, and someone will build a better idiot”; although this is a deadly serious situation, the saying does speak to the truth that it is the responsibility of the conscientious engineer to do their best to anticipate all uses, and misuses, of the machines, systems, and processes they design. A minimal sketch of what such a limiting function might look like appears below.
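To make the idea of a “limiting function” concrete, here is a minimal sketch in Python of how a dose-limiting interlock might work. Everything here is an illustrative assumption: the names (MAX_SAFE_DOSE_MGY, request_scan, DoseLimitError) and the numeric ceiling are invented for this example and are not taken from any real scanner's software.

```python
# Hypothetical sketch of a dose-limiting interlock. All names and the
# 500 mGy ceiling are illustrative assumptions, not real device values.

MAX_SAFE_DOSE_MGY = 500.0  # hard ceiling the operator cannot override


class DoseLimitError(Exception):
    """Raised when a requested scan would exceed the hard dose ceiling."""


def request_scan(requested_dose_mgy: float, auto_adjusted_dose_mgy: float) -> float:
    """Validate a scan request against the hard safety ceiling.

    The check is applied to the dose the machine will actually deliver
    (after any automatic adjustment), not the dose the operator typed in,
    so an "automatic feature" that silently raises the dose is still caught.
    """
    if auto_adjusted_dose_mgy > MAX_SAFE_DOSE_MGY:
        raise DoseLimitError(
            f"Refusing scan: adjusted dose {auto_adjusted_dose_mgy:.0f} mGy "
            f"exceeds the {MAX_SAFE_DOSE_MGY:.0f} mGy safety limit."
        )
    if auto_adjusted_dose_mgy > requested_dose_mgy:
        # Surface the surprise instead of hiding it: warn the operator that
        # the automatic feature increased the dose they asked for.
        print(
            f"WARNING: automatic adjustment raised the dose from "
            f"{requested_dose_mgy:.0f} to {auto_adjusted_dose_mgy:.0f} mGy."
        )
    return auto_adjusted_dose_mgy


if __name__ == "__main__":
    # A request within the limit passes, with a warning about the increase.
    request_scan(80.0, 120.0)
    # A 13x overdose, like the one described in the article, is refused.
    try:
        request_scan(80.0, 1040.0)
    except DoseLimitError as err:
        print(err)
```

The design choice worth noting is that the limit is enforced on the dose the machine will actually deliver, after any automatic adjustment, and that an unexpected increase is reported to the operator rather than applied silently, which is exactly the failure mode the article describes.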
