Why Touchscreens Suck

Touchscreens suck.

Allow me to elaborate. Touchscreens are omnipresent today, but not always for the right reasons. Often the reason is, and I can't express it any other way, laziness: an easy way out to save cost and time, and a lack of commitment to the design and especially the user experience.

When designing a hardware product, one of the most difficult things is deciding how your product will function and how people will use and interact with it. The reason is that, in hardware, you can't change your mind once it hits the market. You have to figure out beforehand where the buttons need to go, how the lights will light up, how hard a dial needs to turn and which tactile feedback is appropriate for each point of interaction. If you get it wrong (or if you've forgotten something), there's no way to change it back. No over-the-air updates for electro-mechanics, I'm afraid.

Our software counterparts have it much easier, of course. If you're developing an app or a SaaS product, you can mock it up quickly in Figma or Adobe XD, throw it into a user's hands, and crank out dozens of iterations in a day until it works smoothly.

User Satisfaction & Retention

In software, a lot of focus is put on UX/UI, and rightfully so. Studies have shown that the amount of friction a user experiences is one of the best indicators of customer satisfaction: the less friction, the better.

You might have heard of NPS (Net Promoter Score), the classic "how likely would you be to recommend us?" metric used to measure customer satisfaction. There's actually another indicator, the Customer Effort Score (CES), which has been shown to outperform NPS in predicting customer retention and to be the strongest driver of customer loyalty (according to a 2010 Harvard Business Review study).

Credit: Harvard Business Review

Just like user research, I've seen a lot of hardware entrepreneurs (and big corporates) completely ignore this and choose not to invest in UX/UI (or the more classical "usability", which I prefer as a term for a number of reasons). To avoid (or let's say ignore) the challenges this presents in hardware, companies often decide to jam a touchscreen into their products and "figure it out later" or "fix it in software".

A touchscreen allows you to significantly change your interaction model, number of buttons and settings after the fact (given it's an IoT device, of course). More often than not, however, this results in a poorer user experience and a weaker product, because the interaction was implemented as an afterthought.

Often it's also a cost-saving measure: fewer physical buttons = lower cost. But then you end up in a car where you have to navigate five menus before you can roll the windows down or turn your system up (for those who got that reference, my heart bleeds for you).

A case for safety

We remain humans with the incredible (underestimated) power of tactile touch as an input and interpretation method. We have hands that can't always be in our field of vision when using something. We should be able to operate the core functionalities of products almost blindly through touch, feel, light and sound, without taking our eyes off what's really important.

It's come to the point that, especially in automotive, this has been recognized by the NCAP safety board: they now include the number of physical interaction points a car offers for frequent basic functions as a metric in deciding how safe a car is. Speaking of safety, this has been common practice in medical devices for at least ten years now, since regulators are forcing "usability engineering" through stricter and stricter regulation.

Usability in general makes a lot of sense in relation to safety, since the discipline saw the light of day because a WW2 bomber, the Boeing B-17, kept crashing. After researching the issue, it turned out that many of the controls looked and felt too similar, while pilots performed a lot of operations blindly, keeping their eyes on the sky.

The landing gear and landing flaps in particular had very similar switches, so pilots would inadvertently retract their landing gear when landing, causing crashes. Out of this, usability engineering as we know it today was born, and shape coding became the norm.

A crashed Boeing B-17 sitting on the tarmac

Rich Interaction

But safety aside, another thing that's bugging me is that each day we stray further from "rich interaction models": the dials and levers we all love on old stereo systems, the mechanical switches on our kitchen appliances... all those very satisfying and fun experiences that add to the brand perception and user experience of a product, slowly being traded away until every product is a box with a screen. As an industrial designer by training, it's really painful to watch, because I truly appreciate the "fun" factor of using a good hardware product.

Just recently I was watching old and new episodes of Pokémon with my 5-year-old (and I was quite the Pokéhead back when the original games and show launched). It struck me how much more interesting and fun the old Pokédex looked compared to the new one (which is now called a Rotom Phone, I guess?).

Just awesome clicky-touchy, chunky goodness!

A boring, gross, floating smartphone.

And those feelings were always there, I suppose. When we started slowly moving towards digitally focused devices, I became inspired by the work of Joep Frens on rich interaction. As a result I place a high value on it, and I'd encourage anyone to check out his work.

Rich interaction camera by Joep Frens

Now what?

Luckily we're seeing a countermovement, with the TP7 field recorder being a beautiful example, and I hope to see many more. Even in the Cupple ice cream machine I worked on, we fought tooth and nail for a screen-free experience, and that thing is hella fun to use!

Teenage Engineering’s TP7 Field Recorder

So. Touchscreens bad? Not at all. They have their rightful place in multimodal products, where one device is supposed to do many things and your eyes are fine being where your hands are: smartphones, tablets and the like. There it makes perfect sense to have an ever-adaptive input method.

I also fully recognize that "real estate" on products is shrinking, and that clever combinations of touchscreens (or touch surfaces) with more traditional interaction methods can give you the best of both worlds (or the worst, if you've ever owned a 2017 NVIDIA Shield remote).

Just don't chicken out, do the work and create an iconic rich interaction product that people will love for generations.
