Thursday, 30 November 2017

"Mind the gap"-style simple information

[Image: 'Mind the gap' sign on the London Underground]
If there's one thing I've learnt about content and navigation in 2 years of user testing, it's:

Keep content and design simple if you want as many people as possible to understand your stuff.

Let's be clear: the kind of content I'm talking about is not novels, not newspaper articles, not gaming or funky stuff. I'm talking about practical content that gives customers answers to questions or helps them do things like find a place to study or find out their rights.

Can your customers find answers first go?

Working recently for a government agency, our mission was to redesign information so the public can:
  • find the right information easily
  • understand information first go, so they don't need to phone the helpline to double check
That way, we can keep the helpline free for people with complex queries and people who need emotional support.

People want Yes/No answers

When user testing legal advice that we thought we'd simplified, we realised it still wasn't clear. The public wanted black and white, Yes or No answers. Not the shades of grey we'd accidentally presented them with. 

And if there isn't a proper Yes or No answer because the law is vague, then it's best to say so rather than leave people wondering.

'Mind the gap' style information

Our favourite moment of user testing was when one person said:

"I want information like 'Mind the gap' when you get on the Tube. It's short. It's factual. And it does the job. It's all you need at the time."
Another person said:
"I like it to be just 2 or 3 lines, which is less daunting. If there's lots of writing, I struggle to keep my attention long enough".

Not everybody scrolls

Government agencies need to be as inclusive as possible so we recruited people who:
  • were not tech savvy 
  • had problems reading and writing
  • had problems concentrating for long periods of time
We found that people who are not tech savvy didn't scroll or explore for answers. They just gave up. They may never work out the structure of your website - they may not realise there are other sections to look through.

Big text is good

"I like big headings, big buttons"
"At home, my friend helped me set up big text and big icons"

Hover-over menus can confuse

On desktop, we found that hover-over menus are learnable by some people but hard to get right for all users. Several people found the menus unpredictable:

"It's doing that thing again. It keeps popping up and I haven't done anything."

"Here we go again." 

Back button as comfort blanket

The browser's Back button seemed to be a comfort blanket for many people.

"I don't want an app"

Several people were pleased that we offered a responsive website - a site that adapts to the smartphone screen:

"I can't be bothered to download an app. It makes you download one and then you never use it again."


How to user test content

Rather than simply showing people some content and asking them what they think, it's better to get people using the content as if they were looking something up at home or at work.

We found the best way to test content was to give people a scenario and questions, and see if they could find the right answers using our prototype.

Need more advice on how to user test content?

Article: Practical advice for testing content on websites by Nielsen Norman Group


Podcast: Designing with a content first approach by Steph Hay (on Jared Spool's User Interface Engineering site, uie.com)

Tuesday, 14 February 2017

Death by leaflet: information overload

I recently watched my mum, 84, struggle through a pile of printouts and leaflets to make sense of a forthcoming hospital appointment. (Don't worry - I helped her.) 

To prepare for the appointment, she needed to stop eating certain things and start taking medication at strict intervals.

Volume of information

What overwhelmed us was the sheer volume of information: 6 printouts, leaflets and letters totalling around 7,500 words. That's almost a dissertation's worth of important information.

And you're given the bunch of leaflets at a time when you're already stressed because you're ill. So it's harder to absorb information. 

Duplicate information

The information in some leaflets duplicated stuff in others. It was confusing: 
"Haven't I just read that somewhere else, or is it different?"  

Contradiction

Leaflets contradicted each other on food advice. One said it was OK to drink milk in the run-up to the appointment; another said "black tea only". To clarify whether mum could have a proper cuppa, I phoned the helpline.

Ambiguity

Ambiguity kept us on our toes: “Unless your doctor has advised otherwise, take the medication in 2 doses…” Our GP hadn't advised us either way, so did that mean 2 doses spread over 2 days, or all of it on the same day?

Tiny font

One of the leaflets had tiny writing - font size 8 or 10 at most. Squinting at tiny text when your eyesight is buggered* only added to the anxiety. (It was an instruction leaflet that came inside a pack of medication.)

Confusing graphics

One leaflet helpfully listed the foods it was OK to eat in the run-up to an appointment. But it used tiny graphics to depict the foods and the shapes were confusing.

“Is that a cloud? No, it’s a cupcake!”  

Litres or pints

The instructions on making up the medication (you had to add water to a powder) were in litres, so I converted them to pints as that’s the unit mum uses. When you're tired and ill, instructions that use an unfamiliar unit of measurement just add to the confusion.
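(If it helps anyone doing the same sum: a litre is about 1.76 imperial pints, so 2 litres is roughly three and a half pints. Here's a quick sketch of the conversion - the factor is just the standard litres-per-pint figure, nothing from the leaflets themselves.)

```python
# Rough litres-to-UK-pints conversion (illustrative only, not from the leaflets)
LITRES_PER_UK_PINT = 0.568261  # one imperial pint, in litres

def litres_to_pints(litres: float) -> float:
    """Convert litres to UK (imperial) pints."""
    return litres / LITRES_PER_UK_PINT

print(round(litres_to_pints(1.0), 2))  # 1.76 pints
print(round(litres_to_pints(2.0), 1))  # 3.5 pints
```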

Some good points too

I’m not criticising all NHS patient leaflets. It’s clear that a lot of work goes into making complicated information simple.

For us, the best leaflets presented information in a digestible format, using bullets and tables rather than long paragraphs. And we liked clear headings to help us skip irrelevant sections.


Personalised information

Ideally, you should never receive irrelevant information. If the NHS is moving from 'one size fits all' to 'personalised medicine', it's time to tailor information to individual patients too. Create information in chunks that you can combine for each individual, as in the rough sketch below.
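To make the 'chunks' idea concrete, here's a rough sketch - entirely my own illustration, with made-up chunk names and wording, not anything from the NHS - of how a leaflet could be assembled from only the chunks that apply to one patient.

```python
# A minimal sketch of 'chunked' patient information (all names and text are made up).
# Each chunk covers one topic; a patient's leaflet is built only from the chunks
# that apply to them, so nobody gets irrelevant or duplicated text.
CHUNKS = {
    "stop_milk": "From 24 hours before your appointment, drink black tea or coffee only.",
    "medication_two_doses": "Take the medication in 2 doses: one the evening before, "
                            "one on the morning of your appointment.",
    "diabetes_advice": "If you have diabetes, contact the clinic before changing your diet.",
}

def build_leaflet(applicable_chunks: list[str]) -> str:
    """Combine only the chunks that apply to this patient into one leaflet."""
    return "\n\n".join(CHUNKS[name] for name in applicable_chunks)

print(build_leaflet(["stop_milk", "medication_two_doses"]))
```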

Handwritten checklist

The way we coped with 7,500 words was to read everything then summarise into a single, handwritten checklist. To have everything on a single sheet of paper was a relief. We closed the leaflets, printouts and letters - no need to keep rechecking them.

NHS 4ever

By the way, I'm in no way criticising NHS staff. They were kind, caring, professional. 

I’m simply saying it's time to make printed patient information more usable - to provide a user experience that matches the compassionate and efficient experience we got at the hospital.

UX starts earlier than you think

As usability/user experience guru Rolf Molich reminds us, user experience starts when you take a product out of its box or make an enquiry before purchasing. UX spans all customer 'touchpoints'. So in my situation, the user experience started with all the printouts we had to struggle through - things could have been so much smoother.

* technical term


Thursday, 7 January 2016

Spotting other people's mistakes - and saving lives in the NHS

How good are you at spotting mistakes? You may not notice your own but you're probably good at spotting other people's.

In jobs where mistakes can cause harm, it's vital to find ways of avoiding them or lessening their impact. This is what I spent last summer researching for my MSc in human computer interaction (HCI).

Accidental overdoses

The workplaces I looked at were NHS hospitals, and the mistakes were accidental drug overdoses. Specifically, accidental overdoses where staff were using machines (infusion devices - see pictures) to give patients a steady dose of drugs, blood, hormones or food over a period of time - anything from 20 minutes to 12 hours.

I also looked at 'underdoses' which can be equally harmful: imagine you need a steady dose of insulin, liquid food or painkillers, and you don't receive it.

Sharp end, blunt end

I say 'mistakes', but in HCI you learn that the user is never to blame. When mistakes happen, it's down to bad equipment design or a perfect storm of events.

James Reason, an expert in human error, distinguishes between the 'sharp end' - frontline staff who come into contact with patients and are often blamed for errors - and the 'blunt end', senior managers and policies that create conditions for mistakes to happen. Sharp end and blunt end are mentioned in Reason's 1995 article Understanding Adverse Events. 

7 years of incidents

I was lucky enough to get my hands on real NHS data - 7 years' worth of incidents from NHS hospitals and care homes that my supervisor, UCL's Ann Blandford, was guarding closely. Ann gave me a password-protected data stick with an Excel spreadsheet of 8,000 incident reports on infusion devices. The reports were written by medical staff, from healthcare assistants to nurses to anaesthetists.

I couldn't read all 8,000, so Ann showed me how to sample the reports systematically, to avoid bias and keep my study as objective as possible.
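For anyone curious what 'sampling systematically' looks like in practice, here's a minimal sketch of the idea - take every k-th report so the sample is spread evenly across the whole spreadsheet. The file name, interval and starting point are my own illustrative assumptions, not details from the study.

```python
# Systematic sampling: take every k-th row so the sample spreads evenly across
# the spreadsheet instead of clustering at the top. File name, interval and
# starting row are illustrative assumptions.
import pandas as pd

reports = pd.read_excel("infusion_incidents.xlsx")  # ~8,000 incident reports
k = 20        # sampling interval: every 20th report gives roughly 400
start = 7     # pick a start point within the first k rows
sample = reports.iloc[start::k]

print(f"Sampled {len(sample)} of {len(reports)} reports")
```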

We are detectives

In June 2015 I started reading reports and dived into a world of busy hospital wards, formal procedures, bleeping medical equipment and stressed staff. It felt like playing detective: What happened here? Was the patient OK? Who did what? Is this person trying to point the finger of blame at a colleague?

I read, re-read and made notes, waiting for ideas to leap out at me. This method's called 'grounded theory', described by Professor Kathy Charmaz in her helpful book Constructing Grounded Theory.

Many of the incident reports were disappointingly brief, missing out vital details. I had to be careful not to jump to conclusions and see things that weren't actually there. My supervisor kept me on track, constantly asking for evidence to back up my hunches.

Clock on, spot error

Pretty soon, a pattern emerged. Nurses were coming on to their shift and noticing that machines had been programmed with the wrong dose of pain relief or antibiotics. Nurses were noticing the previous shift's 'programming errors' - what James Reason, unsurprisingly, calls "fresh eyes".


Three Mile Island

In his 1990 book Human Error, Reason cites the 1979 accident at the Three Mile Island nuclear power plant in the US, where the fresh eyes of a supervisor on an incoming shift diagnosed the problem after colleagues on the previous shift were unable to diagnose it correctly.

Tip of the iceberg

To find more evidence for the incoming shift spotting the previous shift's errors, I sampled around 400 reports. One challenge was that many reports were irrelevant: issues such as a lack of available equipment, dirty machines, broken machines.

But I found enough reports to provide evidence for my theory: nurses spotting errors while carrying out routine checks as part of normal duties, or checking patients off their own bat. And when you consider that hospital incidents are vastly underreported (Billings, 1998, 'Incident reporting systems in medicine'), this could be the tip of the iceberg.

Design recommendations

So if nurses are good at spotting each other's errors when they walk round the wards, why not encourage them to do it more often? Why not increase staffing levels so that nurses can make ward rounds every hour or so? In the context of today's NHS trust budget deficits, this suggestion would probably not go down well.

Another suggestion would be to make it easier to spot errors by making information more 'in your face'. Nurses diagnose errors by comparing a patient's prescription (on a chart or in notes) with the electronic display on an infusion device. Could you make these 2 things more obvious so any mismatch stands out?
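To make that idea concrete, here's a toy sketch - entirely hypothetical, not how any real infusion device or e-prescribing system works - of flagging a mismatch between the prescribed rate and the rate programmed into the device.

```python
# Toy illustration only: flag a mismatch between a prescribed infusion rate and
# the rate programmed into the device. Names and tolerance are hypothetical.
def check_rate(prescribed_ml_per_hr: float, programmed_ml_per_hr: float,
               tolerance: float = 0.05) -> str:
    """Warn if the programmed rate differs from the prescription by more than 5%."""
    if abs(programmed_ml_per_hr - prescribed_ml_per_hr) > tolerance * prescribed_ml_per_hr:
        return (f"MISMATCH: prescribed {prescribed_ml_per_hr} ml/hr, "
                f"programmed {programmed_ml_per_hr} ml/hr - please check")
    return "Rates match"

print(check_rate(prescribed_ml_per_hr=50, programmed_ml_per_hr=500))  # a classic 10x error
```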

I know that spotting programming errors - noticing an ongoing drug overdose and fixing it - is obviously not as good as preventing the error in the first place. But to prevent errors, you need to understand why they happen, and that's something you can't tell from these brief incident reports; you'd need to be there on the wards, shadowing staff and interviewing them.

Robot drug dispensers

If the NHS had the time, money and good project management, it could automate the dispensing of drugs and get all medical systems talking to each other - patient's notes, prescription, hospital pharmacy, barcode on medicine, infusion device - so that, in theory, there would be less room for error. It's not foolproof - automation might turn out to have bad knock-on effects - but it might be worth a try.

Distilled dissertation

So this is my 16,000-word dissertation - a fascinating 3-month investigation last summer - distilled down. I've written it in as plain English as I could muster after a year of using HCI jargon.

I've missed out many stages of the research - it wasn't as quick and easy as I've made out. So here's the full dissertation: The detection of errors in infusion rates on infusion devices: an analysis of incident reports from the National Reporting and Learning System (NRLS).

You spent a whole 3 months proving that?

And you may be thinking: "This 'fresh eyes' theory and nurses spotting each other's errors - it's all common sense, isn't it? I could have told you that and saved you 3 months' work." Well, you're kind of right! But HCI research is often about proving one tiny thing that everybody takes for granted but no one has actually proved. So that's why I did it.