
Thursday, 18 July 2013

When reasoning and action don't match: Intentionality and safety

My team have been discussing the nature of “resilient” behaviour, the basic idea being that people develop strategies for anticipating and avoiding possible errors, and for creating conditions that enable them to recover seamlessly from disturbances. One example that comes up repeatedly is leaving one’s umbrella by the door as a reminder to take it when going out in case of rain. Of course, getting wet doesn’t seriously compromise safety for most people, but let’s let that pass: it’s unpleasant. This view presupposes that people are able to recognise vulnerabilities and identify appropriate strategies to address them. Two recent incidents have made me rethink some of these presuppositions.

On Tuesday, I met up with a friend. She had left her wallet at work. It had been such a hot day that she had taken it out of her back pocket and put it somewhere safe (which was, of course, well hidden). She recognised that she was likely to forget it, and thought of ways to remind herself: leaving a note with her car keys, for instance. But she didn’t act on this intention. So she had done the learning and reflection, but it still didn’t work for her because she didn’t follow through with action.

My partner occasionally forgets to lock the retractable roof on our car. I have never made this mistake, but I wasn’t sure why until I compared his behaviour with mine. It turns out he is more relaxed than I am: he waits while the roof closes before taking the next step, which is often to close the windows, take the keys out of the ignition and get out of the car. I, in contrast, am impatient. I want to lock the roof the moment it closes, so as the roof is coming over, my arm is already going up, ready to lock it. So I never forget (famous last words!): the action is automatised. The important point in relation to resilience is that I didn’t develop this behaviour in order to keep the car safe or secure: I developed it because I assumed the roof needed to be secured and I wanted that to happen as quickly as possible. So it is not intentional, in terms of safety, and yet it has the effect of making the system safer.

So what keeps the system safe(r) is not necessarily what people learn or reflect on, but what they act on. This is, of course, only one aspect of the problem; when major disturbances happen, it’s almost certainly more important to consider people’s competencies and knowledge (and how they acquired them). To (approximately) quote a London Underground controller: “We’re paid for what we know, not what we do”. Ultimately, it's what people do that matters in terms of safety; sometimes that can be clearly traced to what they know, and sometimes it can't.


Saturday, 23 March 2013

"How to avoid mistakes in surgery": a summary and commentary

I've just returned from the US, and my one "must see" catch-up TV programme was "How to avoid mistakes in surgery" (now available on YouTube). It's great to see human error in healthcare getting such prominent billing, and being dealt with in such an informative way. This is a very quick synopsis (of the parts I particularly noted).

The programme uses the case of Elaine Bromiley as the starting point and motivation for being concerned about human error in healthcare. The narrator, Kevin Fong, draws on experience from other domains including aviation, firefighting and Formula One pit-stops to propose ways to make surgery and anaesthesia safer. Themes that emerge include:
  • the importance of training, and the value of simulation suites (simlabs) for setting up challenging scenarios for practice. This is consistent with the literature on naturalistic decision making, though the programme focuses particularly on the importance of situational awareness (seeing the bigger picture).
  • the value of checklists for ensuring that basic safety checks have been completed. This is based on the work of Atul Gawande, and is gaining recognition in UK hospitals. It is claimed that checklists help to change power relationships, particularly in the operating theatre. I don't know whether there is evidence to support this claim, but it is intuitively appealing. Certainly, changing power relationships is important in operating theatres, just as it has been recognised as being important in aviation.
  • the criticality of handovers from the operating theatre to the intensive care unit. This is where the learning from F1 pitstops comes in. It's about having a system and clear roles and someone who's in charge. For me, the way that much of the essential technology gets piled on the bed around the patient raised a particular question: isn't there a better way to do this?
  • dealing with extreme situations that are outside anything that has been trained for or anticipated. The example used for this was the Hudson River plane incident; ironically, on Thursday afternoon, around the time this programme was first broadcast, Pete Doyle and I were discussing this incident as an example that isn't really that extreme, because the pilot had been explicitly trained in all the elements of the situation, though not in the particular combination of them that occurred that day. There is a spectrum of resilient behaviour, and this is an example of well-executed behaviour, but it's not clear to me that it is really "extreme". The programme refers to the need to build a robust, resilient safety system. Who can disagree with this? It advocates an approach of "standardise until you have to improvise". This is fine as far as it goes, but it could miss an important element: standardisation, done badly, reduces the professional expertise and skill of the individual, and it is essential to enhance that expertise if the individual is to be able to improvise effectively. I suspect that clinicians resist checklists precisely because they seem to reduce their professional expertise, when in fact they should be liberating clinicians to develop their expertise at the "edges", to deal better with the extreme situations. But of course that demands that clinical professional development includes opportunities and challenges to develop that expertise. That is a challenge!
The programme finishes with a call to learn from mistakes, to have a positive attitude to errors. Captain Chesley 'Sully' Sullenberger talks about "lessons bought with blood", and about the "moral failure of forgetting these mistakes and having to re-learn them". On the basis of our research to date, and of discussions with others in the US and Canada studying incident reporting and learning from mistakes, this remains a challenge for healthcare.

Wednesday, 4 July 2012

An accident: lots of factors, no blame

At one level, this is a story that has been told many times already, and yet this particular rendering of it is haunting me. I don't know all the details (and never will), so parts of the following are speculation, but the story is my best understanding of what happened, and it highlights some of the challenges in trying to make sense of human error and system design.

The air ambulance made a tricky descent. Although the incident took place near a local hospital, the casualty was badly injured and needed specialist treatment, so was flown to a major trauma centre. Hopefully, he will live.

What happened? The man fell, probably about 10 metres, as he was being lowered from the top of a climbing wall. It seems that he had put his climbing harness on backwards and tied the rope on to a gear loop (which is not designed to hold much weight) rather than tying it in correctly (through the waist loop and leg loop, which were behind him). Apparently, as he let the rope take his weight to be lowered off from the climb, the gear loop gave way.

I can only guess that both the climber and his partner were new to climbing, since apparently neither of them knew how to put the harness on correctly, and also that there was no-one else on the wall at the time (since climbers generally look out for each other and point out unsafe practices). But so many things must have aligned for the accident to happen: both climbers must have signed a declaration that they were experienced and recognised the risks; the harness in question had a gear loop at the centre of the back that they could mistake for a rope attachment point... but that loop wasn't strong enough to take the climber's weight; someone had supplied that harness to the climber without either providing clear instructions on how to put it on or checking that he knew...

So many factors: the climber and his partner apparently believed they were more expert than they actually were; the harness supplier (whether that was a vendor or a friend) didn't check that the climber knew how to use the equipment; there weren't other more expert climbers around to notice the error; the design of the harness had a usability vulnerability (a central loop that actually wasn't rated for a high load and could be mistaken for a rope attachment point); the wall's policy allowed people to self-certify as experienced without checking. Was anyone to blame? Surely not: this wasn't "an accident waiting to happen". But the system clearly wasn't as resilient as it might have been because when all these factors lined up, a young man had to be airlifted to hospital. I wish him well, and hope he makes a full recovery.

The wall has learnt from the incident and changed its admissions policy; hopefully, there will be other learning from it too to further reduce the likelihood of any similar incident occurring in the future. Safety is improved through learning, not through blaming.

Saturday, 24 March 2012

"Be prepared"

We're thinking a lot about resilience at the moment (what it is, what it is not, how it is useful for thinking about design and training). A couple of years ago, I went climbing on Lundy. Beautiful place, highly recommended, though prone to being wet. Lundy doesn't have a climbing equipment shop, so it's important that you have everything with you. And because most of the climbing is on sea cliffs, if you drop anything you're unlikely to be able to retrieve it. So take spares: that's recognising a generic vulnerability, and planning a generic solution. In particular, I had the foresight to take a spare belay plate (essential for keeping your partner safe while climbing). This is an anticipatory approach to resilience for the "known unknowns": first recognise a vulnerability, and then act to reduce the vulnerability.

It happened: when I was halfway up the Devil's Slide, my partner pulled the rope hard just as I was removing it from the belay plate, and I lost my grip... and watched the belay plate bounce down the rock to a watery grave in the sea 30m below. That's OK: I had a spare. Except that I didn't: the spare was in my rucksack at the top of the cliff. Fortunately, though, I had knowledge: I knew how to belay using an Italian hitch knot, so I could improvise with other equipment I was carrying and keep us safe for the rest of the climb. This is a different kind of resilience: having a repertoire of skills that can be brought to bear in unforeseen circumstances, and having generic tools (like bits of string, penknives, and the like) that can be appropriated to fit unexpected needs.

This is a "boy scout" approach to resilience: for the "unknown unknowns" that cannot be anticipated, it's a case of having skills that can be brought to bear to deal with the unforeseen situation, and tools that can be used in ways that they might not have been designed for.

Thursday, 15 March 2012

Undies in the safe

Some time ago, I went to a conference in Konstanz. I put a few items in the room safe (cash, passport, etc.)... and forgot to remove them when I checked out. Oops! Rather inconvenient!

This week, I've been in Stuttgart. How to make use of the room safe while also being sure to remember those important items when I check out? Solution: put my clean underwear for the last day in the safe with the higher-value items. No room thief would be interested in the undies, but I'm not going to leave without them, am I? That worked! It's an example of what we're currently calling a "resilient strategy": we're not sure that that's the right term, so if you (the reader) have better ideas, do let me know. Whatever the word, the important idea is that I anticipated a vulnerability to forgetting (drawing on the analogy of a similar incident) and formulated a way of reducing the likelihood of forgetting, by co-locating the forgettable items with some unforgettable ones.

The strategy worked even better than expected, though, because I told some people about what I'd done (to illustrate a point about resilience) while at the conference. And on my last evening, I was in the lift with another attendee. His parting words were: "don't forget your knickers!" In other situations, that could have been embarrassing; in the context, it raised some smiles... and acted as a further external memory aid to ensure that I remembered not just my clothing, but also the passport and sterling cash that I'd been storing in the safe. Other people engaging with a problem can make the system so much more resilient too!