How Apple helped us prevent blindness through better design


UX Planet · November 27, 2021 · Bhargav Sosale


Takeaways from designing a medical device with Apple

Who better to learn design from? (Photo by Trac Vu)

“I’m not happy with your device.”

I remember Dr. Krishna telling me.

“The quality of images is not adequate for medical diagnosis.”

Dr. Krishna was part of a growing number of users who had given us this feedback. We had every reason to be worried.

Our company built medical devices that captured images of the human retina. Doctors then read these images to provide a diagnosis. Any lapse in image quality meant we risked a false diagnosis. And possibly permanent blindness.

Fortunately, the doctor’s concerns were unfounded. Our device had been validated against the best-in-class diagnostic systems on the market. We had approvals from multiple regulatory agencies. For every user who complained, we had 3 who said our device produced the best images they had ever seen.

So what was the problem?

Dr. Krishna was using the device incorrectly. Like most users, he hadn’t read the user manuals or watched the training videos.

We had our team visit his clinic and, over a period of 3 days, we re-trained every member of his staff. The doctors, the nurses, the lab techs, even the receptionist.

“Thank you so much, the image quality is so much better”, he texted us the next day. Our problem was solved.

But we knew it really wasn’t. The time, effort, and expense involved had hit our operational costs hard. We couldn’t sustainably do this for every customer. And to make matters worse, our devices were increasingly being used by non-healthcare personnel, people who had never used a medical device in their lives.

We had to fix our UX.

Software, not hardware

We knew from the outset that fixing our problem meant fixing our software. Changing the hardware meant months of R&D, regulatory hurdles, and higher manufacturing costs.

I had recently joined the company as the head of software, and fixing our UX became one of my key priorities.

Our medical device was a set of custom lenses and electronics that attached to an iPhone. The camera of the phone was used to capture images of the retina.

An app communicated with the hardware and acted as the primary user interface. You could tell the UX needed work.

We chose an iPhone because it had a Retina display

Using an iPhone helped us in 2 ways.

  • It allowed us to tap into our network of accomplished app designers and ask for help.
  • Apple had noticed our device and was excited by it. They reached out and offered to help.

We took them up on their offer, and we got started.

Designing a medical device for the layperson

Non-medical personnel using our device at a screening camp

“What’s your average user like? Can I presume that it’s either a doctor or a nurse?”

Aniket asked me about our user persona.

“No, you can’t”, I replied.

Aniket was a design evangelist at Apple, working with the Apple Accelerator team. He’d helped dozens of companies from around the world with their design, including winners of the prestigious Apple Design Award.

We were also joined by Abhinav, a close friend of mine and the best designer I knew. Abhinav, at the time, headed the design team of one of India’s largest startups. Today, he runs one of the fastest growing design communities on the internet.

While our customers were doctors, they did not use the device themselves. They only reviewed the images to provide a diagnosis. The staff who actually captured the images were often not medically trained. Some of them hadn’t graduated from high school.

We filled Apple in on our user demographic: their backgrounds, motivations, and capabilities. We shared stories of our interactions with them and helped Apple step into our users’ world.

Our issue wasn’t a lack of user understanding. It was that we lacked direction in designing for them.

The only team pic we took that week. (From L-R, Rachit, Vasu, and Aniket)

We spent 3 days working with Aniket, Abhinav, and the Apple Accelerator team. They helped us identify every source of complexity in our device, and then helped us design for each of them.

By the end of the third day, we had completely redesigned our core retinal imaging experience.

In this article, I outline our 3 biggest takeaways:

  1. Be bold
  2. Exploit familiar analogies
  3. Think beyond screens and wireframes

Essential complexity and the 80/20 Rule

Essential complexity is why pilots need a year of flight school. (Photo by Mael BALLAND)

Before jumping into our takeaways, I wanted to share a note on essential complexity.

Fred Brooks describes essential complexity as any complexity directly caused by the problem you’re trying to solve.

This is in contrast to non-essential, or accidental, complexity: complexity introduced by the specifics of the implementation.

Our product, being a medical device, had a large amount of essential complexity. And no one experienced it as much as our user, the layperson. To them, I imagine that our app screen looked like an airplane cockpit.

We had options to control various LED intensities, the camera ISO, zoom levels, and focus. We had modes for different pupil sizes, and infrared imaging configurations. We also had a separate mode for when eye drops were used.
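
Since our app drove the iPhone’s own camera, many of these knobs mapped onto AVFoundation. For a flavour of what that looks like in code, here is a minimal sketch of pinning exposure, focus, and illumination. Every value, and the use of the torch as a stand-in for our LED control, is illustrative rather than our actual device profile.

    import AVFoundation

    // A sketch of pinning the camera's clinical parameters. All values are
    // illustrative; the real device profile is not public.
    func configureRetinalCapture(device: AVCaptureDevice) throws {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }

        // Fixed exposure: clamp a target ISO into the format's supported range.
        let iso = min(max(400, device.activeFormat.minISO),
                      device.activeFormat.maxISO)
        device.setExposureModeCustom(duration: AVCaptureDevice.currentExposureDuration,
                                     iso: iso,
                                     completionHandler: nil)

        // Focus locked at a lens position suited to the attached optics.
        if device.isLockingFocusWithCustomLensPositionSupported {
            device.setFocusModeLocked(lensPosition: 0.5, completionHandler: nil)
        }

        // Dim, steady illumination instead of a full flash (a stand-in for
        // the hardware's own LED control).
        if device.hasTorch {
            try device.setTorchModeOn(level: 0.3)
        }
    }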

Every single option had clinical relevance. Reducing complexity meant removing clinically essential functionality. It made simplifying our product seem like an impossible task.

The 80/20 rule, also known as the Pareto Principle, is a rule of thumb which, applied to products, says that users spend 80% of their time using just 20% of your features. Anything outside that 20% is not just ignored; it’s usually also the biggest source of UX confusion.

For us, the technical nature of medical imaging made that 20% much harder to identify. And the clinical relevance of each option meant that removing it could risk patient outcomes.

Solving our UX meant learning to manage this complexity. We had to identify and drop anything non-essential. And we had to simplify everything essential.
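
To make the identification step concrete, here is a toy sketch of the kind of tally one could run over in-app analytics: rank the controls by usage and keep the smallest set covering roughly 80% of interactions. The event names and counts below are invented.

    // Toy sketch: find the ~20% of controls that carry ~80% of use.
    // Event names and counts are invented.
    let usage: [String: Int] = [
        "capture": 9_200, "brightness": 2_100, "focus": 640,
        "zoom": 310, "isoOverride": 45, "eyeDropMode": 25,
    ]

    let total = usage.values.reduce(0, +)
    let ranked = usage.sorted { $0.value > $1.value }

    var covered = 0
    var core: [String] = []
    for (name, count) in ranked {
        if Double(covered) >= 0.8 * Double(total) { break }
        core.append(name)
        covered += count
    }
    print(core) // ["capture", "brightness"]: candidates for the main screen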

To do this, we were forced to examine our biases and broaden our perspective on what good design was. Our key takeaways from the sprint helped us achieve just that.

Takeaway 1: Be bold

When I was a founder, courage came easy. We had no product, no users, and no revenue. We had nothing at stake.

So we took risks, and those risks allowed us to innovate.

Now, that was no longer the case. Our company had thousands of users screening millions of patients. We had many digits of revenue.

Those stakes, along with the nature of our device, made it harder to take risks. Our fear of removing something clinically essential had stopped us from removing anything at all.

We had fallen prey to the ‘focusing illusion’.

The Focusing Illusion

The focusing illusion is a cognitive bias, best summarised by Daniel Kahneman:

Nothing in life is as important as you think it is, while you are thinking about it.

And this was true for our team.

It was only when we considered removing a function that we remembered its medical relevance. Even if a particular feature was rarely used, the thought of a potential consequence prevented us from removing it.

And those potential consequences always seemed more dire at the exact time we were thinking about them.

The focusing illusion had prevented us from being bold.

The last iPhone with the headphone jack. (Photo by freestocks)

“We took one of our biggest risks with our most established product. We removed the headphone jack.”

The Apple team reminded me of this. And they were right.

Apple had been bold, despite having a lot at stake. It wasn’t long before every smartphone manufacturer had also removed the headphone jack.

Apple had set industry standards by taking bold risks.

Our first step was fixing our mindset, even if it did not come to us naturally. We forced ourselves to err on the other extreme. We removed more than we needed to. We fought the focusing illusion at every step.

By the end of the first day, we had made great headway in removing non-essential functions. We had started making progress.

Takeaway 2: Exploit familiar analogies

We still needed a strategy to handle the essential complexity that came with our device.

Removing too little meant that our UX problems continued. Removing too much meant that our product was clinically irrelevant. We were unable to find the optimal balance.

It was when we stepped out for a break that the solution hit us like a brick: we had spotted a street vendor using a $100 Android phone.

Our users were confused because they had not used medical devices before. But nearly every one of them had used their smartphone camera.

Why couldn’t we use established analogies from other camera applications?

Replace, not remove

We realised that essential complexity could be simplified with familiar analogies.

Our users didn’t need to know they were adjusting an LED. They were just increasing the brightness. They didn’t need to know whether they were capturing under infrared mode. They just needed to know that the flash was off.

“Replace, not remove” quickly became our mantra. We decided to replace anything we could with the closest camera equivalent.
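
In code, the mantra amounts to a thin translation layer: the interface speaks everyday camera vocabulary, and a mapping underneath turns it into clinical parameters. Here is a minimal sketch, with hypothetical names and an assumed 0-100 LED scale; this is not our actual API.

    // "Replace, not remove": the UI speaks camera vocabulary; a thin layer
    // translates it into clinical parameters. All names are hypothetical.
    enum ImagingMode { case whiteLight, infrared }

    struct FamiliarControls {
        var brightness: Double // 0...1, like any camera app's slider
        var flashOn: Bool      // what the user sees and understands

        // What the hardware actually receives.
        var ledIntensity: Double { brightness * 100 } // assumed 0-100 scale
        var mode: ImagingMode { flashOn ? .whiteLight : .infrared }
    }

    let controls = FamiliarControls(brightness: 0.6, flashOn: false)
    // controls.ledIntensity == 60.0, controls.mode == .infrared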

This approach had 2 clear benefits:

  • It reduced the amount of learning required
  • It eliminated potential confusion

We quickly began sketching out what this would look like, primarily drawing inspiration from the native iOS and Android camera applications. We standardised our layouts and icons as much as we could.

We had begun simplifying our essential complexity.

L — Some of our early sketches on a whiteboard. R — Some higher fidelity wireframes

Progressive disclosure

While not related to the camera analogy, I wanted to add a note about another pattern that helped us manage complexity.

Progressive Disclosure is the practice of showing only the features that are immediately relevant to the user. Everything else is hidden from view or relegated to a different screen, and disclosed only when the user requests it.

It’s great for dealing with those features that are essential but are rarely used. It’s also useful when you need to support power users who require complex functionality.

While our product had many options, we knew they were not all needed at the same time. Progressive Disclosure allowed us to surface each feature only at the moment it was needed.

This helped us significantly de-clutter our interface and reduce cognitive overload.

The best example of this was our imaging mode for when eye drops were used. We knew from speaking with our users that eye drops were needed for only a small fraction of their patients. We moved this setting to a separate page, knowing that users could reach it whenever they needed it.
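
In SwiftUI terms, the pattern can be as simple as the sketch below. The control names are hypothetical, and the real screen was more involved.

    import SwiftUI

    // Sketch of progressive disclosure: everyday controls stay on the capture
    // screen; the rarely used eye-drop mode is shown only on request.
    // Control names are hypothetical.
    struct CaptureControls: View {
        @State private var brightness = 0.5
        @State private var flashOn = false
        @State private var eyeDropMode = false

        var body: some View {
            Form {
                Toggle("Flash", isOn: $flashOn)
                VStack(alignment: .leading) {
                    Text("Brightness")
                    Slider(value: $brightness, in: 0...1)
                }
                // Disclosed only when the user asks for it.
                DisclosureGroup("Advanced") {
                    Toggle("Eye-drop mode", isOn: $eyeDropMode)
                }
            }
        }
    }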

Seems like Apple loves Progressive Disclosure (WWDC 2017)

Takeaway 3: Think beyond screens and wireframes

There was still one last hurdle remaining. Our users did not know what a retina looked like, let alone how to position our device to image one.

For patient comfort, our device used very dim infrared light. Imaging the retina under those conditions did not make the process easier.

“Dude, this is giving me anxiety”, I remember Abhinav saying. He was trying to capture a photo of my retina.

“I’m not sure if I can clearly see the retina, and I have no idea if it’s in focus. How do you know when to click?”

I could empathise with his experience. I was barely proficient with the device myself. And this was a key reason why our device sometimes took a few days of training.

No amount of boldness or UI simplification could solve this. It meant thinking beyond screens and wireframes.

I’m not lying — This is the optimal capture position

The problem redefined, we needed two things:

  1. A method to improve real-time visibility under low light
  2. A method to automate the process of capturing an image

Solving this successfully meant that our users didn’t need to know or care about the ideal capture position. They simply had to point, and the device would take care of the rest.

Fortunately, this was squarely within our core expertise. It was the one place we didn’t need any help.

We had previously built image processing algorithms that had outperformed Google. We knew exactly how to improve retina visibility, and automate the capture process. It wasn’t long before we had visibility enhancement and auto-capture algorithms ready.
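
The real algorithms aren’t something I can share, so here is only a toy stand-in for the capture trigger: score each frame’s sharpness with the variance of a 3x3 Laplacian, and fire the shutter once a few consecutive frames clear a threshold.

    // Toy focus metric: variance of a 3x3 Laplacian over an 8-bit grayscale
    // frame. Higher variance suggests a sharper retina.
    func laplacianVariance(_ gray: [UInt8], width: Int, height: Int) -> Double {
        var values: [Double] = []
        values.reserveCapacity((width - 2) * (height - 2))
        for y in 1..<(height - 1) {
            for x in 1..<(width - 1) {
                let i = y * width + x
                values.append(Double(gray[i - 1]) + Double(gray[i + 1])
                            + Double(gray[i - width]) + Double(gray[i + width])
                            - 4 * Double(gray[i]))
            }
        }
        let mean = values.reduce(0, +) / Double(values.count)
        return values.reduce(0) { $0 + ($1 - mean) * ($1 - mean) } / Double(values.count)
    }

    // Debounced trigger: capture only after N consecutive sharp frames.
    struct AutoCapture {
        let threshold: Double // tuned on real data; arbitrary here
        let framesRequired: Int
        var streak = 0

        mutating func shouldCapture(score: Double) -> Bool {
            streak = score > threshold ? streak + 1 : 0
            return streak >= framesRequired
        }
    }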

Beyond that sketch, I’m going to skip the technical details for the sake of brevity. But the main point I want to highlight is this:

It was only when we broadened our perspective of design that we solved our problem.

While working on a redesign, it’s easy to get carried away focusing on just the UI. We needed to remind ourselves that design is about effective problem solving. And sometimes that means having your AI research team be the designers.

This was a lesson I had previously learned the hard way. This is also why some “poorly designed” products like Craigslist are extremely popular with their users.

It was when we designed beyond just screens and wireframes that we solved our users’ problem.

The auto-capture function in the final product.

The before and after

L — Before, R — After. The right has 60% fewer options than the left.

And if you’re wondering whether it worked — Yesterday, I trained a customer to capture nearly flawless images with our device.

It was a 30 minute Zoom call.

Thanks for making it to the end! If you enjoyed reading this, hit that 👏 button 50 times, and share it with your friends!

If you have any questions or would like to know the next time I post or publish something, hit me up on Twitter.

Thanks to everyone who helped with the proofreading.

And shout-outs to Vasu, Rachit, Abhinav, Florian, Chiran, Aniket, and the Apple Accelerator team for all the work that went into this redesign.


How Apple helped us prevent blindness through better design was originally published in UX Planet on Medium.
