Sunday 18 March 2018

Invisible work

I have been on strike for much of the past four weeks – at least notionally. The truth is more nuanced than that, because I don't actually want my students and other junior colleagues to be disadvantaged by this action. I am, after all, fighting for the future of university education: their future. Yet I do want senior management and the powerful people who make decisions about our work and our pensions to be aware of the strength of feeling, as well as the rational arguments, around the pensions issue.

There have been some excellent analyses of the problem, by academic experts from a range of disciplines. Here are some of my favourites (in no particular order):

As well as standing on picket lines, marching, discussing the issues around the strike, and not doing work that involves crossing said picket lines, I have continued to do a substantial amount of work. It has made me think more about the nature of invisible work. Bonnie Nardi and Yrjö Engeström identify four kinds of invisible work:
  1. work done in invisible places, such as the highly skilled behind-the-scenes work of reference librarians; to this I would add most of the invisible work done by university staff, out of sight and out of hours.
  2. work defined as routine or manual that actually requires considerable problem solving and knowledge, such as the work of telephone operators; don't forget completing a Researchfish submission or grappling with the Virtual Learning Environment or many other enterprise software systems.
  3. work done by invisible people, such as domestics (and sometimes Athena SWAN teams!);
  4. informal work processes that are not part of anybody's job description but which are crucial for the collective functioning of the workplace, such as regular but open-ended meetings without a specific agenda, informal conversations, gossip, humour, storytelling.
The time for (4) has been sadly eroded over the years as demands and expectations have risen without corresponding rises in resourcing.

To these, I would add work that is invisible because it is apparently ineffectual. For example, I wrote to our Provost about 12 days ago, but I have no evidence that the letter was read; it certainly hasn't been responded to in any visible way. (It is reproduced below for the record.)

The double-think required to simultaneously be on strike while also delivering on time-limited commitments to colleagues and students has forced me to also develop new approaches to revealing and hiding work. For example, 
  • I have started logging my own work hours so that the accumulated time is visible to me. Although I've been working well beyond the hours set out in the Working Time Directive, I'm going to try to bring the time worked down to comply with it. This should help me say "no" more assertively in future. That's the theory, at least...
  • I have started saving emails as drafts so as not to send them "out of hours". There are 21 items in my email outbox as I type this; I'll look incredibly productive first thing on Monday morning!
And finally, I will make visible the letter I wrote to the Provost:


Thank you for this encouraging message last week. You are right that none of us takes strike action lightly. We all want to be doing and supporting excellent teaching, research and knowledge transfer, but we are extremely concerned about the proposed pension changes, and we have found no other way to be heard.

I’ve worked in universities since 1981 and this is the first time I have taken strike action. The decision to strike has been one of the harder decisions I have taken in my professional career, but I think the impact of the proposed pension changes on our junior colleagues (and hence on the future of universities) is unacceptable, and I am not persuaded that a DB scheme is unaffordable.

Please continue to work with the other university leaders to find an early resolution to this dispute. UCL isn’t just estates and financial surplus: as you say, it’s a community of world-leading, committed people who work really hard, and who merit an overall remuneration package that is reflective of that. That includes pensions that aren’t a stock market lottery for each individual.

I’d like to be in my office meeting PhD students and post-docs next Monday morning, and in a lecture theatre with my MSc students on Monday afternoon. Please do everything in your power to bring this dispute to a quick resolution so that there’s a real possibility that “normal service” can be resumed next week.

Sunday 4 March 2018

How not to design the user experience: update 2018

In November 2014, I wrote a summary of my experience of entering research data in Researchfish. Since then, aspects of the system have improved: at least some of the most obvious bugs have been ironed out, and being able to link data from ORCID makes one tedious aspect of the task (entering data about particular publications) significantly easier. So well done to the Researchfish team for fixing those problems. It's a pity the system is still not fit for purpose, despite the number of funders who are supporting (mandating) its use.

The system is still designed without any consideration of how people conceptualise their research outputs – or at least, not how I do. According to Researchfish, it takes less than a lunch break to enter all the data. There are two problems with this:
1. Few academics that I know have the time to take a lunch break.
2. So far, today, it has taken me longer than that just to work out a strategy for completing this multi-dimensional task systematically. It's like 3-D Sudoku, but less fun.

Even for publications, it's a two-dimensional task: select publications (e.g., from ORCID) and select the grants to which they apply. But if you do this exactly as stated, you get many-to-many relationships, with every publication assigned to grants it isn't associated with as well as the one(s) it is. And yes, I have tested this. So you have to decide which grant you're going to focus on, then go through the list and add only those publications... then go around the loop (add new publications > select ORCID > search > select publications > select grant) repeatedly for all grants. Maybe there's a faster way, but I haven't discovered it yet. Oh: and if you make a mistake, there isn't an easy way to correct it, so there is probably over-reporting as well as under-reporting on many grants.
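The combinatorics above can be sketched in a few lines. This is a hypothetical illustration only: the paper and grant names are invented, and this is not Researchfish's actual data model – it just shows why bulk selection over-reports and why the per-grant loop is the workaround.

```python
# Hypothetical sketch of the cross-product problem: bulk-assigning every
# selected publication to every selected grant creates far more links
# than the true (sparse) publication-grant mapping.

def bulk_assign(publications, grants):
    """Naive bulk selection: every publication linked to every grant."""
    return {(p, g) for p in publications for g in grants}

def per_grant_assign(true_links):
    """The workaround loop: for each grant, add only its own publications."""
    links = set()
    for grant, pubs in true_links.items():
        for p in pubs:
            links.add((p, grant))
    return links

# Toy data (invented for illustration).
pubs = ["paper-A", "paper-B", "paper-C"]
grants = ["grant-1", "grant-2"]
true_links = {"grant-1": ["paper-A", "paper-B"], "grant-2": ["paper-C"]}

bulk = bulk_assign(pubs, grants)        # 3 publications x 2 grants = 6 links
correct = per_grant_assign(true_links)  # only 3 genuine links
spurious = bulk - correct               # the over-reported links
print(len(bulk), len(correct), len(spurious))  # 6 3 3
```

Three spurious links from a toy example of three papers and two grants; scale that up to a real publication record and the over-reporting is substantial.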
I'm still trying to guess what "author not available" means in the information about a publication. My strategy for working out which paper each line refers to has been to keep Google Scholar open in parallel and search for the titles there, because those make more sense to me.

In the section on reporting key findings of a grant, when you save the entry, it returns you to the same page. Why would you want to save multiple times, rather than just moving on to the next step? Why isn't there a 'next' option? And why, when you have said there is no update on a completed grant, does it still take you to the update page? What was the point of the question?

When you're within the context of one award, and you select publications, it shows all publications for all awards (until you explicitly select the option to focus on this award). Why? I'm in a particular task context...

When you're in the context of an award where you are simply a team member, you can filter by publications you've added, or by publications linked to this award, but not by publications that you've added that are also linked to this award. Those are the ones that I know about, and the ones that I want to check / update.
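The missing filter is simply the intersection of the two filters the interface already offers. A hypothetical sketch, with invented paper identifiers:

```python
# Hypothetical illustration: the filter I want is the intersection of
# "publications I've added" and "publications linked to this award".
added_by_me = {"paper-A", "paper-B", "paper-D"}
linked_to_award = {"paper-B", "paper-D", "paper-E"}

# Python's set intersection operator gives exactly the missing filter.
mine_on_this_award = added_by_me & linked_to_award
print(sorted(mine_on_this_award))  # ['paper-B', 'paper-D']
```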

Having taken a coffee break, I returned to the interface to discover I had been logged out. I don't actually know my login details because the first time I logged in this morning I did so via ORCID. That option isn't available on the login page that appears after time-out. This is further evidence of poor system testing and non-existent user testing.

I could go on, but life is too short. There is no evidence of the developers having considered either conceptual design or task structures. There is no evidence that the system has actually been tested by real users who have real data entry tasks and time constraints. I really cannot comprehend how so many funders can mandate the use of a system that is so poorly designed, other than because they have the power to do so.