Of checklists and tragedies

In The Checklist Manifesto, Dr. Atul Gawande expands on his previous writing about the work of Dr. Peter Pronovost. Pronovost developed a system to help reduce complications of hospital care, such as infected central venous catheters, and it has been very successful. It is based on the idea that some tasks are simply too complex to perform error-free from memory. Medical care has become very effective, but also very complex, to the point where one person cannot possibly remember every step in some processes, even simple steps such as scrubbing in.

The simple and successful solution is to create checklists analogous to what pilots use.  Even though each step in preparing to fly a plane or place a central line may be simple and easy to remember, there are so many of these steps that it's easy to drop one.

I'm fortunate to live in a state and work in a hospital that is using this "Keystone" system. Hopefully these checklist strategies will be validated for wider use.

This idea has been gnawing at me since reading a disturbing article in today's New York Times.  The article describes what can go wrong with radiation therapy.  There are some terribly intelligent folks working in the field of radiation oncology, a field that requires the collaboration of several kinds of experts including specially trained physicians and physicists.  But, like other complex tasks, planning and executing a radiation treatment requires several steps, each of which is vital.  

The errors described in the Times article caused horrific injuries.  Often they were due to simple errors, such as failing to re-check a setting on a machine.  This seems like just the sort of error that would be amenable to a checklist system.  It's a complex task requiring multiple interdependent steps whose potential outcomes are very, very important.  Maybe it's time for the radiation folks to give Dr. Pronovost a call.

(I received an unsolicited free copy of Checklist Manifesto from the publisher.)


Is this related to the study that some noise was being made about a year or two back, in which nurses were given treatment checklists (I forget the details now)? It had to be stopped because someone decided it violated an ethical guideline, even though it had already demonstrated the large number of lives and dollars saved by following these procedures.

I couldn't help but note that Jerome-Parks ended up in this particular hospital based on a recommendation by someone at church. This must be another example of God watching out for us.

What scares the holy shit out of me about this story, is that normally I have the ability to double check everything my doctors do.

In this case, what can you as the patient possibly do to make sure things go right? How can you find out if the machines being used on you are calibrated correctly before anything is done?

By Ironic Irene (not verified) on 24 Jan 2010 #permalink

There's an old military saw: "every checklist is written in blood."

Meaning: if an item is on a checklist, it's so critical that someone else in the past has died for not checking it.

The medical field (and many others) could learn a lot by examining how things are handled on a flight deck during critical phases of flight.

Checklists are not a panacea -- but, implemented correctly, they take almost no time to complete and prevent the most common, egregious and deadly errors.

By David Mudkips (not verified) on 24 Jan 2010 #permalink

The radiation therapy article that you link to is very worrying. As a former safety engineer, I'm surprised that the accelerators involved lack interlocks or built-in alarms that would at least have to be manually reset before a dangerous run could commence, as in the two cases described.

Often they were due to simple errors, such as failing to re-check a setting on a machine. This seems like just the sort of error that would be amenable to a checklist system.

They might also be amenable to automation of some sort. An engineering solution, backed up by regular checks of its function, might be preferable to a solution that relies on humans to implement it. No matter how careful you are, how rigorously you stick to the checklist, it's all too easy for something to go wrong if you're relying on one person's unsupported cerebral cortex to get everything right.
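The kind of interlock described above, one that latches on a fault and cannot be cleared except by deliberate human action, can be sketched in a few lines. This is an illustrative toy, not a real accelerator control system; the class names and dose limit are invented:

```python
# Illustrative sketch only -- not a real control system.
# Models a hardware-style latching interlock: once a fault trips,
# the machine refuses to run until a person manually resets it.

class InterlockError(Exception):
    pass

class Interlock:
    def __init__(self):
        self.tripped = False
        self.reason = ""

    def trip(self, reason):
        """Latch the fault; it stays latched until manually reset."""
        self.tripped = True
        self.reason = reason

    def manual_reset(self):
        """Requires deliberate human action -- it cannot be cleared
        in software as a side effect of starting a new run."""
        self.tripped = False

class Accelerator:
    def __init__(self):
        self.interlock = Interlock()

    def start_run(self, dose, max_safe_dose=200):
        if self.interlock.tripped:
            raise InterlockError("interlock tripped: " + self.interlock.reason)
        if dose > max_safe_dose:
            self.interlock.trip(f"dose {dose} exceeds limit {max_safe_dose}")
            raise InterlockError("dose out of range; run blocked")
        return "run started"
```

The point of the latch is that a second, apparently safe run is still refused until someone explicitly acknowledges and clears the earlier fault.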

They might also be amenable to automation of some sort. An engineering solution, backed up by regular checks of its function, might be preferable to a solution that relies on humans to implement it.

There are some pretty profound social/psychological obstacles there.

By D. C. Sessions (not verified) on 25 Jan 2010 #permalink

There are some pretty profound social/psychological obstacles there.

Could you give some examples? Maybe they can be overcome.

Dianne: knowledge of past failures, perhaps? Interestingly, in the organization with perhaps the most visceral understanding of the price of failure, man-in-the-loop is mandatory. That would be the military. For any device designed to kill people, a man (or woman) has to be in the control loop, authorizing it to proceed. This is because while people aren't perfect, neither are the machines they build. But since they have different imperfections, hopefully they can catch one another's errors. But obviously they don't always.

The Therac-25 scandal is one of the more infamous examples, all down to what amounts to a race condition. One of the classic types of software bug, and one which under no circumstances should've gotten out "into the wild".
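The check-then-act hazard behind that class of bug can be illustrated with a deliberately simplified, single-threaded simulation. This is not the actual Therac-25 logic; the mode names and methods are invented. The idea is that the operator's edit takes effect on the display before the slower hardware reconfiguration completes, and an unguarded "fire" acts on stale state:

```python
# Simplified illustration of a check-then-act race.
# NOT the actual Therac-25 code; names and details are invented.

class Machine:
    def __init__(self):
        self.mode = "x-ray"        # what the console displays
        self.beam_setup = "x-ray"  # what the hardware is actually configured for

    def operator_edits_mode(self, new_mode):
        # The operator's edit updates the displayed mode immediately...
        self.mode = new_mode
        # ...but the slow hardware reconfiguration has not happened yet.

    def finish_hardware_setup(self):
        # The hardware eventually catches up with the displayed mode.
        self.beam_setup = self.mode

    def fire_unsafe(self):
        # Fires without re-checking that the hardware matches the display.
        return f"fired with beam configured for {self.beam_setup}"

    def fire_guarded(self):
        # Re-verify consistency at the moment of action.
        if self.beam_setup != self.mode:
            raise RuntimeError("setup/mode mismatch -- refusing to fire")
        return self.fire_unsafe()
```

The guarded version is the software analogue of an interlock: it re-checks the invariant at the instant it matters instead of trusting an earlier check.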

I have no doubt whatsoever that despite the Therac-25 case being brought up in nearly every computer science curriculum, it will be repeated. It pains me deeply to see that this has already happened, and in a manner so similar. Do people not learn *anything* from the mistakes of the past?

By Calli Arcale (not verified) on 25 Jan 2010 #permalink

This is because while people aren't perfect, neither are the machines they build. But since they have different imperfections, hopefully they can catch one another's errors.

That's how I would hope it would work in medicine. To use an example already in place, the computer at the hospital I work in automatically checks for bad drug interactions and/or allergies. If any are found, it flags the prescriber, who can overrule or correct as appropriate (i.e., sometimes you really did mean to prescribe both aspirin and Coumadin).
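A toy version of that kind of interaction check might look like the sketch below. The interaction table is an invented example, not clinical data, and Coumadin appears under its generic name, warfarin:

```python
# Minimal sketch of a drug-interaction flagger. The table and drug
# names are invented examples, not a clinical database.

KNOWN_INTERACTIONS = {
    frozenset({"aspirin", "warfarin"}): "increased bleeding risk",
    frozenset({"simvastatin", "clarithromycin"}): "increased statin levels",
}

def check_new_prescription(current_meds, new_drug):
    """Return a list of warnings; the prescriber can override each one."""
    warnings = []
    for med in current_meds:
        pair = frozenset({med.lower(), new_drug.lower()})
        if pair in KNOWN_INTERACTIONS:
            warnings.append(f"{med} + {new_drug}: {KNOWN_INTERACTIONS[pair]}")
    return warnings
```

Crucially, the function only warns; the human stays in the loop to overrule or correct, matching the division of labor described above.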

The problem I have with checklists is that they can easily become just another form to be filled out. Again, a local example: we use a pre-procedure checklist designed to make sure that any procedure is performed on the right patient and, if appropriate, on the right body part or side (e.g., to avoid amputating the wrong limb). Great in theory, but in practice it often gets filled out in a haphazard way and is sometimes applied to completely inappropriate situations. (There is no correct side from which to perform a bone marrow biopsy except in special circumstances.) I particularly worry that if too many checklists are used, they'll be filled out by rote without verifying the conditions that are supposed to be checked. So if the details can be simplified or handled automatically, that might be of benefit in some circumstances.
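One partial defense against rote box-ticking is to require a read-back of the actual value rather than a bare check-mark, and let the software do the comparison. A minimal sketch, with invented item names:

```python
# Illustrative sketch: checklist items that demand a read-back of the
# observed value instead of a yes/no tick. Item names are invented.

def verify_item(expected, observed):
    """Pass only if the user's read-back matches the order."""
    return expected.strip().lower() == observed.strip().lower()

def run_checklist(order, observations):
    """Return the list of items that failed verification."""
    failures = []
    for item, expected in order.items():
        observed = observations.get(item, "")
        if not verify_item(expected, observed):
            failures.append(item)
    return failures
```

A user can still copy values without looking, but a read-back at least forces contact with the source document, which a checkbox does not.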

I read the NYT article and was horrified at the damage wrought to those people and nearly as horrified at the quality of the software and cross check process.

Having spent most of my life involved in aviation, I can attest to the value of checklists. That said, not all situations have checklists and then experience comes into play. It should be that way for medical checklists as well.

Aircraft software is a lot more reliable than the Varian software as described. That seems like an area where the FDA could stiffen the rules. Lives are at stake, after all.

Another difference between aviation and medicine is checkrides. This is one way checklists don't become simply rote exercises. Pilots get checkrides where they have to demonstrate checklist discipline in addition to basic piloting skills, perhaps some medical specialties would benefit from that as well.

And then there is the enlightened self-interest angle for pilots to follow their checklists, because doctors bury their mistakes and pilots are buried in theirs.

By The Gregarious… (not verified) on 25 Jan 2010 #permalink

And then there is the enlightened self-interest angle for pilots to follow their checklists, because doctors bury their mistakes and pilots are buried in theirs.

Can I steal this line?

@Dianne

Feel free, I can't claim to be the originator.

By The Gregarious… (not verified) on 25 Jan 2010 #permalink

This sort of failure almost screams "engineers were ignored". That is, any engineer involved (and there must have been plenty) would have recognized the problem instantly, and would have spoken up, and probably suggested a way to make the mistake impossible. As I understand it, though, at medical equipment manufacturers, as at pharmaceutical companies, MDs are King, and everybody without some sort of doctorate, combined, has no more influence than the janitor.

This is a case of Management Failure, but I'm sure everybody in a position to fix it sees it as a technical problem that nobody could have foreseen.

By Nathan Myers (not verified) on 26 Jan 2010 #permalink

I'm a military pilot with a little over 2,000 hours of flight time, including combat time in Southwest Asia. I'm also a medical student in the U.S. It shocks and saddens me that checklists are not universal in hospitals. My generation hears urban legends about the pilots of yore (like the 'Nam guys) who flew "cowboy ops," doing whatever the hell they pleased... well, we crashed a lot of planes that way. And according to a large amount of data, we're still killing a lot of patients that way.

I see a lot of doctors who mock the idea of checklists, but I think opinion is shifting. It heartens me that some of the bloggers I enjoy the most on this site appear to be in favor of checklists, in some manner or other.

Checklists are not a panacea, though. They are enormously beneficial in many situations, but by themselves they are not enough; they're just a small part of the crew resource management (CRM) picture. Providers in all positions need to be trained in tools such as better communication techniques, better hand-off protocols, and leadership skills (hint: don't throw the scalpel across the room in anger). Institutions need to get a better grip on the balance between fatigue management and educational opportunities. Medicine as a whole puts itself on a pedestal and often refuses to learn from others. Solutions are out there, but we're slow to learn...