Tuesday 20 August 2013

Hidden in full view: the daft things you overlook when designing and conducting studies

Several years ago, when Anne Adams and I were studying how people engaged with health information, we came up with the notion of an "information journey", with three main stages: recognising an information need, gathering information, and interpreting that information. The important point (to us) in that work was highlighting the importance of interpretation: the dominant view of information seeking at that time was that if people could find information then that was job done. But we found that an important role for clinicians is in helping lay people to interpret clinical information in terms of what it means for that individual – hence our focus on interpretation.

In later studies of lawyers' information work, Simon Attfield and I realised that there were two important elements missing from the information journey as we'd formulated it: information validation and information use. When we looked back at the health data, we didn't see a lot of evidence of validation (it might have been there, but it was largely implicit, and rolled up with interpretation) but – now sensitised to it – we found lots of evidence of information use. Doh! Of course people use the information – e.g. in subsequent health management – but we simply hadn't noticed it because people didn't talk explicitly about it as "using" the information. Extend the model.

Wind forwards to today, and I'm writing a chapter for InteractionDesign.org on semi-structured qualitative studies. Don't hold your breath on this appearing: it's taking longer than I'd expected.

I've (partly) structured it according to the PRETAR framework for planning and conducting studies:
  • what's the Purpose of the study?
  • what Resources are available?
  • what Ethical considerations need to be taken into account?
  • what Techniques for data gathering?
  • how to Analyse data?
  • how to Report results?
...and, having been working with that framework for several years now, I have just realised that there's an important element missing, somewhere between resources and techniques for data gathering. What's missing is the step of taking the resources (which define what is possible) and using them to shape the detailed design of the study – e.g., in terms of interventions.

I've tended to lump the details of participant recruitment in with Resources (even though it's really part of the detailed study design), and of informed consent in with Ethics. But what about interventions such as giving people specific tasks to do for a think-aloud study? Or giving people a new device to use? Or planning the details of a semi-structured interview script? Just because a resource is available, that doesn't mean it's automatically going to be used in the study, and all those decisions – which of course get made in designing a study – precede data gathering. I don't think this means a total re-write of the chapter, but a certain amount of cutting and pasting is about to happen ...

Tuesday 13 August 2013

Wizard of Oz: the medium and the message

Last week, one of my colleagues asserted that it didn't matter how a message was communicated – that the medium and the message were independent. I raised a quizzical eyebrow. A few days previously, I'd been in Vancouver, and had visited the Museum of Anthropology. It's a delightful place: some amazing art and artefacts from many different cultures. Most of them relate to ceremony and celebration, rather than everyday life, but they give a flavour of people's cultures, beliefs and practices. And most of them are beautiful.

One object that caught my attention was a yakantakw, or "speaking through post". According to the accompanying description: "A carved figure such as this one, with its prominent, open mouth, was used during winter ceremonies. A person who held the privilege of speaking on behalf of the hosts would conceal himself behind the figure, projecting his voice forward. It was as though the ancestor himself was calling to the assembled guests." This particular speaking through post dates from 1860, predating the Wizard of Oz by about 40 years.

In HCI, we talk about "Wizard of Oz experiments" in which participants are intended to believe that they are interacting with a computer system when in fact they are interacting with a human being who is hiding behind that system. It matters that people think that they are interacting with a computer rather than another human being. The analogy with the Wizard of Oz is quite obvious. But it looks like the native people in that region beat L. Frank Baum to the idea, and we should really be calling them "Yakantakw experiments". Just as soon as we Western people learn to pronounce that word.