By Nick Dawe, on 4 November 2011
Have you ever left your card in a chip-and-pin machine, mistyped a phone number, left your petrol cap on the roof of your car or, more worryingly, poured orange juice onto your cornflakes? As somebody who recently misread a label and poured a carton of chickpeas (instead of custard) over a sticky toffee pudding, I’m particularly interested in how and why such errors occur.
To mark World Usability Day (10 November), Professor Ann Blandford gave a clear and helpful lecture on how the design of technology can make these errors less likely. While the mistakes described above merely cause delays and aggravation, Professor Blandford began by highlighting how technology design can affect far more serious situations.
In 2010, an agency nurse was caught on CCTV turning off a patient’s life support machine, and then being unable to work out how to turn the machine back on again or how to use the resuscitation equipment. The patient was left struggling to survive until paramedics arrived 21 minutes later.
In such a situation, where should the blame lie? At first it may seem to be with the nurse, or perhaps with the agency employing her. But what about the designers of the life support machine?
As Professor Blandford argued, a design can make errors less likely, easier to spot, easier to recover from, or simply make the consequences of these errors less catastrophic. For instance:
- To stop people leaving a petrol filler cap on top of their car, the cap could be designed to be attached to the car, so that it isn’t even possible to put it on the car roof in the first place.
- Reminders such as beeps and lights can be added to chip-and-pin devices to stop people leaving their cards behind.
- The order of a device’s task structure may also be redesigned to help reduce errors: an ATM will return a user’s card before supplying cash, as the user is mainly thinking about collecting their money and may easily forget to take their card otherwise.
- ‘Active monitoring’ can take place, such as a fuel cap that automatically detects the type of fuel being used and alerts the user if they are putting the wrong type in.
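The ATM example above is a classic case of what usability researchers call a post-completion error: once the user’s main goal (getting cash) is achieved, any remaining steps are easily forgotten. As a minimal sketch (my own illustration, not from the lecture), we can model a user who walks away as soon as they have the item they came for, and see why returning the card first is safer:

```python
def simulate(dispense_order, goal="cash"):
    """Model a distracted ATM user: they take items in the machine's
    dispense order, but leave as soon as the goal item is in hand.
    Anything dispensed afterwards is forgotten in the machine."""
    taken = []
    for item in dispense_order:
        taken.append(item)
        if item == goal:  # goal achieved -> user walks away
            break
    forgotten = [i for i in dispense_order if i not in taken]
    return taken, forgotten

# Error-prone design: cash comes out first, so the card is forgotten.
print(simulate(["cash", "card"]))  # (['cash'], ['card'])

# Safer design: the card must be taken before the goal item appears.
print(simulate(["card", "cash"]))  # (['card', 'cash'], [])
```

The same reordering principle applies to any task sequence with a clear user goal: put the forgettable steps before the goal step, so completing the goal guarantees they were done.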
Errors can occur for a variety of reasons, such as cognitive and perceptual distractions, or simply being in a new environment where the procedures we’re used to no longer apply (e.g. we may get confused about where the controls are in a new car and set the windscreen wipers going when meaning to signal right).
Having said all this, people do adapt to improve their performance with these devices. A user may keep a hand on their card while it is in the chip-and-pin machine to make sure they don’t forget it. A user of a more complex device (such as a kidney dialysis machine) may even annotate it with notes of its settings.
Professor Blandford then focused back on nursing – a role in which errors can have disastrous results. Nursing, among other things, involves serious multitasking and regular interruptions, and so more serious errors may well occur. Indeed, a number of tragic examples from the US were given, in which accidental overdoses and similar errors had catastrophic results.
These kinds of incidents can occur in healthcare for many reasons. In very rare cases they may be intentional, but they may also be due to carelessness, equipment malfunctions, human error or for more complicated reasons.
The lecture ended with Professor Blandford citing the first example of the agency nurse and repeating the question of who was responsible: the nurse, those who employed her, or the designers of the technology involved. Rather than having a blame culture that focuses on particular individuals, are there more general questions we need to be asking about the nature of the errors involved?
Nick Dawe is Digital Media Manager in UCL Communications & Marketing.
- World Usability Day
- Chi+med (Computer Human Interaction for Medical Devices)
- Error diary – A diary of daily failures
Watch lecture online here: