Saturday 24 March 2012

"Be prepared"

We're thinking a lot about resilience at the moment (what it is, what it is not, how it is useful for thinking about design and training). A couple of years ago, I went climbing on Lundy. Beautiful place, highly recommended, though prone to being wet. Lundy doesn't have a climbing equipment shop, so it's important that you have everything with you. And because most of the climbing is on sea cliffs, if you drop anything you're unlikely to be able to retrieve it. So take spares: that's recognising a generic vulnerability, and planning a generic solution. In particular, I had the foresight to take a spare belay plate (essential for keeping your partner safe while climbing). This is an anticipatory approach to resilience for the "known unknowns": first recognise a vulnerability, and then act to reduce the vulnerability.

It happened: when I was halfway up the Devil's Slide, my partner pulled the rope hard just as I was removing it from the belay plate, and I lost my grip... and watched the belay plate bounce down the rock to a watery grave in the sea 30m below. That's OK: I had a spare. Except that I didn't: the spare was in my rucksack at the top of the cliff. Fortunately, though, I had knowledge: I knew how to belay using an Italian Hitch knot, so I could improvise with other equipment I was carrying and keep us safe for the rest of the climb. This is a different kind of resilience: having a repertoire of skills that can be brought to bear in unforeseen circumstances, and having generic tools (like bits of string, penknives, and the like) that can be appropriated to fit unexpected needs.

This is a "boy scout" approach to resilience: for the "unknown unknowns" that cannot be anticipated, it's a case of having skills that can be brought to bear to deal with the unforeseen situation, and tools that can be used in ways that they might not have been designed for.

Thursday 15 March 2012

Undies in the safe

Some time ago, I went to a conference in Konstanz. I put a few items in the room safe (££, passport, etc.)... and forgot to remove them when I checked out. Oops! Rather inconvenient!

This week, I've been in Stuttgart. How to make use of the room safe while also being sure to remember those important items when I check out? Solution: put my clean underwear for the last day in the safe with the higher-value items. No room thief would be interested in the undies, but I'm not going to leave without them, am I? That worked! It's an example of what we're currently calling a "resilient strategy": we're not sure that that's the right term, so if you (the reader) have better ideas, do let me know. Whatever the word, the important idea is that I anticipated a vulnerability to forgetting (drawing on the analogy of a similar incident) and formulated a way of reducing the likelihood of forgetting, by co-locating the forgettable items with some unforgettable ones.

The strategy worked even better than expected, though, because I told some people about what I'd done (to illustrate a point about resilience) while at the conference. And on my last evening, I was in the lift with another attendee. His parting words were: "don't forget your knickers!" In other situations, that could have been embarrassing; in the context, it raised some smiles... and acted as a further external memory aid to ensure that I remembered not just my clothing, but also the passport and sterling cash that I'd been storing in the safe. Other people engaging with a problem can make the system so much more resilient too!

A black & white regulatory world

I've just come home from MedTec Europe. The Human Factors stream was very interesting, with some great talks. However, the discussion focused largely on safety, error and legislation. This focus is important, but if it becomes the sole focus then innovation is stifled. All anyone will do is satisfy the requirements and avoid taking risks.

While it is a widely agreed aim of all medical interventions to “do no harm”, any intervention carries some small risk of harm, and medical progress requires that we accept those risks. So “no harm” has to be balanced by “where possible, do good” (where “good” is difficult to measure, though concepts such as Quality-Adjusted Life Years, or QALYs, try to capture this idea). Without risk, we would have no interventions – no medical profession, no drugs, no treatments. That is unimaginable. So we need to have a mature debate about acceptable risk. The world is not black-and-white… but every new piece of regulation reduces the shades of grey that are acceptable.
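For readers unfamiliar with the measure, a rough sketch of the standard textbook formulation (nothing specific to the MedTec discussions) is that a QALY weights time spent in a health state by a quality-of-life factor between 0 and 1:

\[ \text{QALYs gained} \;=\; \sum_{i} q_i \, t_i, \qquad q_i \in [0,1] \]

where \(t_i\) is the time spent in health state \(i\) and \(q_i\) is its quality weight (1 for full health, 0 for death). So, for example, an intervention that gives someone four extra years at a quality weight of 0.75 yields three QALYs, which can then be weighed against the intervention's risks and costs.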

Imagine that universities changed their assessment to a simple pass or fail. What information does this give the future employers of our students about which of them are likely to perform well? Of course, academic excellence isn't the only assessment criterion, but if it's not a criterion at all then why do we assess it? More to the point, if it became a simple pass-fail, what motivation would there be for students to excel? The canny student would do the minimum to pass, and enjoy themselves (even) more. A pass-fail result shows only whether work basically conforms to requirements. The more detailed grading gives an indication of how well the work performs: work that was awarded a mark of 91% has been assessed as being of substantially higher quality than work that was awarded 51%, even though both have passed. Even this is a fairly blunt instrument, and I am certainly not suggesting that medical devices be graded on a scale of 1 to 100. Quite apart from anything else, the best inhaler on the market for a young boy suffering from asthma is unlikely to also be the most appropriate for an elderly lady suffering from COPD.

Regulation is a very blunt instrument, and needs to be used with care. We also need to find ways to talk about the more complex (positive) qualities of next-generation products: risks are important, but so are benefits.

Saturday 10 March 2012

Attitudes to error in healthcare: when will we learn?

In a recent pair of radio programmes, James Reason discusses the possibility of a change in attitude in the UK National Health Service regarding human error and patient safety. The first programme focuses on experiences in the US, where some hospitals have shifted their approach towards open disclosure: being frank with the affected patients and their families about incidents. It shouldn't really be a surprise that this has reduced litigation and the size of payouts, as families feel more listened to and recognise that their bad experience has at least led to learning that reduces the likelihood of such an error happening again.

The second programme focuses more on the UK National Health Service, on the "duty of candour" and "mandatory disclosure", and the idea of an open relationship between healthcare professional and patients. It discusses the fact that the traditional secrecy and cover-ups lead to "secondary trauma", in which patients' families suffer from the silence and the frustration of not being able to get to the truth. There is of course also a negative effect on doctors and nurses who suffer the guilt of harming someone who had put their trust in them. It wasn't mentioned in the programme, but the suicide of Kim Hiatt is a case in point.

A shift in attitude requires a huge cultural shift. There is local learning (e.g. by an individual clinician or a clinical team) that probably takes effect even without disclosure, provided that there is a chance to reflect on the incident. But to have a broader impact, the learning needs to be disseminated more widely. This should lead to changes in practice, and also to changes in the design of technology and protocols for delivering clinical care. This requires incident reporting mechanisms that are open, thorough and clear. Rather than focusing on who is "responsible" (with a subtext that that individual is to blame), or on how to "manage" an incident (e.g. in terms of how it gets reported by the media), we will only make real progress on patient safety by emphasising learning. Reports of incidents that lay blame (e.g. the report on an unfortunate incident in which a baby received an overdose) will hardly encourage greater disclosure: if you fear blame then the natural reaction is to clam up. Conversely, though, if you clam up then that tends to encourage others to blame: it becomes a vicious cycle.

As I've argued in a recent CS4FN article, we need a changed attitude to reporting incidents that recognises the value of reporting for learning. We also need incident reporting mechanisms that are open and effective: that contain enough detail to facilitate learning (without compromising patient or clinician confidentiality), and that are available to view and to search, so that others can learn from every unfortunate error. It's not true that every cloud has a silver lining, but if learning is effective then it can be the silver lining in the cloud of each unfortunate incident.