Touchless gesture interaction

Posted by Dr Harry George on Aug 9, 2019 2:04:16 PM

Before Steve Jobs unveiled the original iPhone back in 2007, few of us had seriously considered interacting with a phone without buttons. Now we don’t think twice about it.

Roll on a few years and we’ve moved past button-less phones and into the realm of Siri, Alexa and Cortana. While voice assistants still have a way to go (coping with accents and unusual pronunciations causes all sorts of hurdles for the AI underpinning the technology), there is little doubt about the potential of voice recognition.

These forms of interaction are known as natural user interfaces (NUIs) because the way we interact with them is consistent with our ‘natural’ behaviour. One of the newest and most exciting types of NUI is touchless gesture interaction.

Using hand gestures alone, and without actually touching anything, users can control their devices. For example, drivers of a BMW 7 Series can turn their audio volume up or down simply by twirling a finger in the air!

Google are taking touchless gesture interaction to the next level with Project Soli, a purpose-built interaction sensor that uses radar to track fine hand movements. It sounds complicated, but here are a few examples of what it might allow you to do, followed by a rough sketch of how such gestures could map to on-screen actions:

  • Selecting a button or clicking a link just by pressing your forefinger and thumb together.
  • Scrolling a page or moving a slider just by rubbing your thumb back and forth over your forefinger.
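To make that concrete, here is a minimal sketch in TypeScript of how recognised gestures might be wired up to familiar on-screen behaviour. The event names (pinch, rub) and the GestureSensor interface are illustrative assumptions made for this post, not Google’s actual Soli API:

    // Hypothetical gesture events: these names are assumptions, not part of any real Soli SDK.
    type GestureEvent =
      | { kind: "pinch" }                 // forefinger and thumb pressed together
      | { kind: "rub"; delta: number };   // thumb rubbed across the forefinger

    // A stand-in for whatever hardware layer reports recognised gestures.
    interface GestureSensor {
      onGesture(handler: (event: GestureEvent) => void): void;
    }

    // Map gestures onto interactions the user already understands:
    // a pinch behaves like a click, rubbing behaves like scrolling.
    function bindGestures(sensor: GestureSensor): void {
      sensor.onGesture((event) => {
        switch (event.kind) {
          case "pinch":
            // Activate whatever currently has focus, as a mouse click would.
            (document.activeElement as HTMLElement | null)?.click();
            break;
          case "rub":
            // Scroll in proportion to how far the thumb moved.
            window.scrollBy({ top: event.delta * 40, behavior: "smooth" });
            break;
        }
      });
    }

The details don’t matter; the point is simply that the sensing layer emits gestures and the application maps them onto actions the user already knows.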

It looks hugely impressive and could well be the future of human-computer interaction.

As always, the user should be at the forefront when designing innovative interaction methods like these.

The good thing about NUIs is that they build on natural behaviours, which makes them inherently user-friendly. Bill Buxton, a Microsoft researcher, says that NUIs “exploit skills that we have acquired through a lifetime of living in the world, which minimizes the cognitive load and therefore minimizes the distraction”.

Joshua Blake, an expert in the NUI field and author of ‘Natural User Interfaces in .NET’, expands on this with four design guidelines for creating effective NUIs:

  • Instant expertise – Take advantage of the users’ existing skills to save them the trouble of learning something completely new, e.g. talking or swiping.
  • Progressive learning – Allow novices to learn through a clear, step-by-step path, whilst letting advanced users apply the skills they already have, to prevent frustration.
  • Direct interaction – NUIs should imitate the user’s interaction with the physical world, as on the iPad, where our actions feel directly tied to what happens on the screen.
  • Cognitive load – A good interaction limits cognitive load so the user can focus on achieving their task; ideally, the NUI should rely on basic knowledge and simple skills.

Touchless gesture interaction is an exciting new step for the NUI world, and one that is already on our doorstep. Soon we will simply be waving at our wearables and snapping our fingers at our smart devices… a bit like magic!


Want to know more?

If you’d like to talk to someone about how you can optimise your digital media with user research and advice, please get in touch!

We would love to hear from you.

