Thursday, 5 April 2012

KISS: Keep It Simple, Sam!

Tony Hoare is credited with claiming that... "There are two ways of constructing a software design; one way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult." Of course, he is focusing on software: on whether it is easy to read or test, or whether it is impossible to read (what used to be called "spaghetti code" but probably has some other name now), and impossible to devise a comprehensive set of tests for.

When systems suffer "feature creep", where they acquire more and more features to address real or imagined user needs, it's nigh on impossible to keep the code simple, so inevitably it becomes harder to test, and harder to be confident that the testing has been comprehensive. This is a universal truth, and it's certainly the case in the design of software for infusion devices. The addition of drug libraries and dose error reduction software, and the implementation of multi-function systems to be used across a range of settings for a variety of purposes, make it increasingly difficult to be sure that the software will perform as intended under all circumstances. There is then a trade-off between delivering a timely system and delivering a well-designed, well-tested one... or delivering a system that then needs repeated software upgrades as problems are unearthed. And you can never be sure you've really found all the possible problems.
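To make Hoare's point concrete, here is a minimal, purely hypothetical sketch in Python of how the testing burden grows as features are added. The function names, drug-library structure and limits are all invented for illustration; they are not taken from any real infusion device's software.

```python
# A deliberately toy sketch, not based on any real device's software;
# all names here are invented purely to illustrate the point.

# Version 1: simple enough that there are obviously no deficiencies.
def check_rate_v1(rate_ml_per_h, max_rate_ml_per_h=999):
    """Reject impossible pump rates."""
    return 0 < rate_ml_per_h <= max_rate_ml_per_h


# Version 2: the "same" check after adding a drug library, soft/hard dose
# limits and ward-specific profiles. Each feature adds branches, and the
# branches interact, so the cases that need testing multiply.
def check_rate_v2(rate_ml_per_h, drug, profile, drug_library):
    limits = drug_library.get(profile, {}).get(drug)
    if limits is None:                       # drug not in this ward's library
        return "unlisted: confirm manually"
    if rate_ml_per_h <= 0:
        return "blocked"
    if rate_ml_per_h > limits["hard_max"]:   # hard limit: refuse
        return "blocked"
    if rate_ml_per_h > limits["soft_max"]:   # soft limit: warn, allow override
        return "warn"
    return "ok"


# Two drugs x three profiles x four rate bands already needs 24 test cases,
# before considering library updates or software versions -- compared with
# two or three tests for version 1.
```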

These aren't just problems for the software: they're also problems for the users. When software upgrades change the way the system performs, it's difficult for the users to predict how it will behave. Nurses don't have the mental resources to be constantly thinking about whether they're working with the infusion device that's running version 3.7 of the software or the one that's been upgraded to version 3.8, or to anticipate the effects of the different software versions, or different drug libraries, on system performance. Systems that are already complicated enough are made even more so by such variability.

Having fought with several complicated technologies recently, my experience is not that they have no obvious deficiencies, but that those deficiencies are really, really hard to articulate clearly. And if you can't even describe a problem, it's going to be very hard to fix it. Better to avoid problems in the first place: KISS!

Saturday, 24 March 2012

"Be prepared"

We're thinking a lot about resilience at the moment (what it is, what it is not, how it is useful for thinking about design and training). A couple of years ago, I went climbing on Lundy. Beautiful place, highly recommended, though prone to being wet. Lundy doesn't have a climbing equipment shop, so it's important that you have everything with you. And because most of the climbing is on sea cliffs, if you drop anything you're unlikely to be able to retrieve it. So take spares: that's recognising a generic vulnerability, and planning a generic solution. In particular, I had the foresight to take a spare belay plate (essential for keeping your partner safe while climbing). This is an anticipatory approach to resilience for the "known unknowns": first recognise a vulnerability, and then act to reduce the vulnerability.

It happened: when I was half way up the Devil's Slide, my partner pulled the rope hard just as I was removing it from the belay plate, and I lost my grip... and watched the belay plate bounce down the rock to a watery grave in the sea 30m below. That's OK: I had a spare. Except that I didn't: the spare was in my rucksack at the top of the cliff. Fortunately, though, I had knowledge: I knew how to belay using an Italian Hitch knot, so I could improvise with other equipment I was carrying and keep us safe for the rest of the climb. This is a different kind of resilience: having a repertoire of skills that can be brought to bear in unforeseen circumstances, and having generic tools (like bits of string, penknives, and the like) that can be appropriated to fit unexpected needs.

This is a "boy scout" approach to resilience: for the "unknown unknowns" that cannot be anticipated, it's a case of having skills that can be brought to bear to deal with the unforeseen situation, and tools that can be used in ways that they might not have been designed for.

Thursday, 15 March 2012

Undies in the safe

Some time ago, I went to a conference in Konstanz. I put a few items in the room safe (££, passport, etc.)... and forgot to remove them when I checked out. Oops! Rather inconvenient!

This week, I've been in Stuttgart. How to make use of the room safe while also being sure to remember those important items when I check out? Solution: put my clean underwear for the last day in the safe with the higher-value items. No room thief would be interested in the undies, but I'm not going to leave without them, am I? That worked! It's an example of what we're currently calling a "resilient strategy": we're not sure that that's the right term, so if you (the reader) have better ideas, do let me know. Whatever the word, the important idea is that I anticipated a vulnerability to forgetting (drawing on the analogy of a similar incident) and formulated a way of reducing the likelihood of forgetting, by co-locating the forgettable items with some unforgettable ones.

The strategy worked even better than expected, though, because I told some people about what I'd done (to illustrate a point about resilience) while at the conference. And on my last evening, I was in the lift with another attendee. His parting words were: "don't forget your knickers!" In other situations, that could have been embarrassing; in the context, it raised some smiles... and acted as a further external memory aid to ensure that I remembered not just my clothing, but also the passport and sterling cash that I'd been storing in the safe. Other people engaging with a problem can make the system so much more resilient too!

A black & white regulatory world

I've just come home from MedTec Europe. The Human Factors stream was very interesting, with some great talks. However, the discussion focused largely on safety, error and legislation. This focus is important, but if it becomes the sole focus then all innovation is stifled: all anyone will do is satisfy the requirements and avoid taking risks.

While it is a widespread and widely agreed aim of all medical interventions to “do no harm”, any intervention carries some small risk of harm, and medical progress requires that we accept those risks. So “no harm” has to be balanced by “where possible, do good” (where “good” is difficult to measure, though concepts such as Quality-Adjusted Life Years, or QALYs, try to capture this idea). Without risk, we would have no interventions – no medical profession, no drugs, no treatments. That is unimaginable. So we need to have a mature debate about acceptable risk. The world is not black-and-white… but every new piece of regulation reduces the shades of grey that are acceptable.

Imagine that universities changed their assessment to a simple pass or fail. What information would this give the future employers of our students about which of them are likely to perform well? Of course, academic excellence isn't the only assessment criterion, but if it's not a criterion at all then why do we assess it? More to the point, if it became a simple pass-fail, what motivation would there be for students to excel? The canny student would do the minimum to pass, and enjoy themselves (even) more. A pass-fail judgement shows whether work basically conforms to requirements or not; more detailed grading gives an indication of how well the work performs: work that was awarded a mark of 91% has been assessed as being of substantially higher quality than work that was awarded 51%, even though both have passed. Even this is a fairly blunt instrument, and I am certainly not suggesting that medical devices be graded on a scale of 1 to 100. Quite apart from anything else, the best inhaler on the market for a young boy suffering from asthma is unlikely to also be the most appropriate for an elderly lady suffering from COPD.

Regulation is a very blunt instrument, and needs to be used with care. We also need to find ways to talk about the more complex (positive) qualities of next-generation products: risks are important, but so are benefits.

Saturday, 10 March 2012

Attitudes to error in healthcare: when will we learn?

In a recent pair of radio programmes, James Reason discusses the possibility of a change in attitude in the UK National Health Service regarding human error and patient safety. The first programme focuses on experiences in the US, where some hospitals have shifted towards open disclosure, discussing incidents candidly with the affected patients and their families. It shouldn't really be a surprise that this has reduced litigation and the size of payouts: families feel more listened to, and recognise that their bad experience has at least led to some learning that reduces the likelihood of such an error happening again.

The second programme focuses more on the UK National Health Service, on the "duty of candour" and "mandatory disclosure", and the idea of an open relationship between healthcare professional and patients. It discusses the fact that the traditional secrecy and cover-ups lead to "secondary trauma", in which patients' families suffer from the silence and the frustration of not being able to get to the truth. There is of course also a negative effect on doctors and nurses who suffer the guilt of harming someone who had put their trust in them. It wasn't mentioned in the programme, but the suicide of Kim Hiatt is a case in point.

A shift in attitude requires a huge cultural shift. There is local learning (e.g. by an individual clinician or a clinical team) that probably takes effect even without disclosure, provided that there is a chance to reflect on the incident. But to have a broader impact, the learning needs to be disseminated more widely. This should lead to changes in practice, and also to changes in the design of technology and protocols for delivering clinical care. This requires incident reporting mechanisms that are open, thorough and clear. Rather than focusing on who is "responsible" (with a subtext that that individual is to blame), or on how to "manage" an incident (e.g. in terms of how it gets reported by the media), we will only make real progress on patient safety by emphasising learning. Reports of incidents that lay blame (e.g. the report on an unfortunate incident in which a baby received an overdose) will hardly encourage greater disclosure: if you fear blame then the natural reaction is to clam up. Conversely, though, if you clam up then that tends to encourage others to blame: it becomes a vicious cycle.

As I've argued in a recent CS4FN article, we need a changed attitude to reporting incidents that recognises the value of reporting for learning. We also need incident reporting mechanisms that are open and effective: that contain enough detail to facilitate learning (without compromising patient or clinician confidentiality), and that are available to view and to search, so that others can learn from every unfortunate error. It's not true that every cloud has a silver lining, but if learning is effective then it can be the silver lining in the cloud of each unfortunate incident.
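As a purely illustrative sketch of what "enough detail to learn from, without identifying anyone" might look like, here is a hypothetical incident-record structure in Python. Every field name and the example report are invented for illustration; they are not drawn from any real reporting system.

```python
# A purely illustrative sketch -- field names and the example are invented,
# not drawn from any real incident-reporting system.
from dataclasses import dataclass, field
from typing import List

@dataclass
class IncidentReport:
    # Enough detail to learn from...
    setting: str                  # e.g. "adult ICU, night shift"
    device_type: str              # e.g. "volumetric infusion pump"
    software_version: str
    what_happened: str            # free-text narrative of the sequence of events
    contributing_factors: List[str] = field(default_factory=list)
    # ...plus keywords so that other teams can find and search the report,
    # but no field that identifies a patient or a clinician.
    keywords: List[str] = field(default_factory=list)

# A hypothetical report that others could search for and learn from
# without knowing who was involved.
report = IncidentReport(
    setting="adult ICU, night shift",
    device_type="volumetric infusion pump",
    software_version="3.8",
    what_happened="Rate entered in ml/h when the prescription was in mg/kg/h; "
                  "the warning was dismissed under time pressure.",
    contributing_factors=["unit confusion", "interruption", "time pressure"],
    keywords=["dose error", "units", "infusion pump"],
)
```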

Sunday, 26 February 2012

Ordering wine: the physical, the digital and the social

For a family birthday recently, we went to Inamo. This is not a restaurant review, but reflections on an interactive experience.

Instead of physical menus and a physical waiter, each of us had a personal interactive area on the tabletop that we used to send our individual order to the kitchen and do various other things. In some ways this was great fun (we could have "tablecloth wars" in which we kept changing the decor on the table, or play games such as Battleships across the table).

In other ways it was quite dysfunctional. For example, we had to explicitly negotiate about who was going to order bottles of water and wine because otherwise we'd have ended up with either none or 5 bottles. In most restaurants, you'd hear whether it's been ordered yet or not, so you know how to behave when it's your turn to order. But it's more subtle than that: whereas with physical menus people tend to hold them up so that they are still "in the space" with their party, with the tabletop menus people were heads-down and more engrossed in ordering from the menu than the company, and there was no external cue (the arrival of the waiter) to synchronise ordering. So the shift from the physical to the digital meant that some activities that used to be seamless have now become seamful and error-prone. The human-human coordination that is invisible (or seamless) in the physical world has to be made explicit and coordinated in the digital. Conversely, the digital design creates new possibilities that it would be difficult to replicate in the physical implementation.
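One way to imagine making that coordination explicit is to give shared items a single, table-level order that every diner's screen reflects. The sketch below is a hypothetical design illustration in Python, nothing to do with Inamo's actual system.

```python
# Hypothetical design sketch: shared items live in one table-level order,
# so the system (rather than the diners) does the coordination.
class TableOrder:
    """Order state shared by everyone at the table."""

    def __init__(self):
        self.shared_items = {}  # item -> quantity already ordered for the table

    def order_shared(self, item, quantity=1):
        # If the item is already on the table's order, prompt before adding
        # more -- the digital equivalent of overhearing someone order the wine.
        if item in self.shared_items:
            return f"'{item}' already ordered for the table; add another?"
        self.shared_items[item] = quantity
        return f"'{item}' x{quantity} ordered for the table"


table = TableOrder()
print(table.order_shared("sparkling water"))  # first diner: order goes through
print(table.order_shared("sparkling water"))  # second diner: prompted, not duplicated
```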

There is a widespread belief that you can take a physical activity and implement a digital solution that is, in all respects, the same or better. Not so: there are almost always trade-offs.

Saturday, 18 February 2012

Device use in intensive care

Atish Rajkomar's study of how infusion devices are used in intensive care has just been accepted for publication in the Journal of Biomedical Informatics: a great outcome from an MSc project!

It's a great achievement for someone without a clinical background to go into such a complex clinical environment and make sense of anything that's going on there. The Distributed Cognition approach that Atish took seems to have been a help, providing a way of looking at the environment that focuses attention on some of the things that matter (though maybe overlooking other things in the process). But this is a difficult thing to prove!

One of the real challenges for the design of future healthcare technologies is that, to design effectively, the design team really does need dual expertise: in technology design and in clinical work. There are few courses available that provide such dual expertise, and surprisingly few people seem to be interested in acquiring it. Therein lies another challenge: how to make healthcare technologies interesting and engaging?