Before Steve Jobs revealed the original iPhone in January 2007, few of us had ever considered interacting with a phone without buttons. Now we don’t think twice about it.
Roll on a few years and we’ve moved past button-less phones and into the realms of Siri, Alexa and Cortana. While voice assistants still have a way to go (accents and unusual pronunciations still trip up the AI underpinning the technology), there is little doubt that the potential of voice recognition is there.
These forms of interaction are known as natural user interfaces (NUIs) because the way we interact with them is consistent with our ‘natural’ behaviour. One of the new and exciting types of NUI is touchless gesture interaction.
Users can control their devices with hand gestures alone, without touching anything at all. For example, drivers of a BMW 7 Series can turn the speaker volume up or down simply by twirling a finger in the air!
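To make that idea concrete, here is a minimal sketch of how a gesture event might be wired to a volume control. The `GestureSensor`-style event names and the `on_gesture` callback are invented purely for illustration; they do not correspond to any real automotive SDK.

```python
# Hypothetical sketch: mapping touchless gesture events to a volume control.
# The event names and callback shape are invented for illustration only.

class VolumeControl:
    def __init__(self, level=50):
        self.level = level  # volume on a 0-100 scale

    def adjust(self, delta):
        # Clamp the volume to the valid range and report it.
        self.level = max(0, min(100, self.level + delta))
        print(f"Volume: {self.level}")


def on_gesture(event, volume):
    # A clockwise finger twirl raises the volume; counter-clockwise lowers it.
    if event == "twirl_clockwise":
        volume.adjust(+5)
    elif event == "twirl_counterclockwise":
        volume.adjust(-5)


volume = VolumeControl()
for event in ["twirl_clockwise", "twirl_clockwise", "twirl_counterclockwise"]:
    on_gesture(event, volume)
```

The point of the mapping is that the gesture mirrors the physical action it replaces (twisting a volume knob), which is exactly what makes the interaction feel ‘natural’.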
Google are taking touchless gesture interaction to the next level with Project Soli, a purpose-built interaction sensor that uses radar to track fine hand movements. Sounds complicated, but here are a few examples of what it might allow you to do:

- Press a virtual button by tapping your thumb against your index finger.
- Turn a virtual dial by rubbing your thumb against the side of your finger.
- Drag a virtual slider by sliding your thumb along your finger.
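At a high level, a radar-based sensor like Soli works by extracting features from the reflected signal (such as energy, velocity and range) and feeding them to a classifier that decides which gesture was made. The sketch below illustrates that pipeline in miniature; the feature names, thresholds and gesture labels are assumptions made for illustration, not Soli’s actual signal processing.

```python
# Illustrative sketch of a radar gesture pipeline: signal features in,
# gesture label out. Features and thresholds are invented for illustration
# and bear no relation to Soli's real processing.

from dataclasses import dataclass

@dataclass
class RadarFrame:
    energy: float    # strength of the reflected signal
    velocity: float  # radial hand velocity (positive = approaching)
    range_cm: float  # distance of the hand from the sensor

def classify(frame: RadarFrame) -> str:
    # Toy rules standing in for a trained gesture-recognition model.
    if frame.energy < 0.1:
        return "no_hand"
    if abs(frame.velocity) > 2.0:
        return "swipe"
    if frame.range_cm < 5.0:
        return "virtual_button_press"
    return "hover"

for frame in [RadarFrame(0.05, 0.0, 30.0),
              RadarFrame(0.8, 3.5, 15.0),
              RadarFrame(0.9, 0.2, 3.0)]:
    print(classify(frame))
```

In a real system the hand-written rules would be replaced by a model trained on thousands of labelled radar traces, but the shape of the pipeline is the same.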
It looks hugely impressive and could well be the future of human-computer interaction.
As always, when designing these innovative interactions, the user should be at the forefront.
The good thing about NUIs is that they make use of natural behaviours, which makes them inherently user-friendly. Bill Buxton, a researcher at Microsoft, says that NUIs “exploit skills that we have acquired through a lifetime of living in the world, which minimizes the cognitive load and therefore minimizes the distraction”.
Joshua Blake, an expert in the NUI field and author of ‘Natural User Interfaces in .NET’, expands on this and outlines four design guidelines to consider when designing effective NUIs:

- Instant expertise: build on skills users have already mastered elsewhere.
- Cognitive load: keep the mental effort required to interact as low as possible.
- Progressive learning: provide a smooth path from novice to expert use.
- Direct interaction: let users act directly on content rather than through intermediary controls.
Touchless gesture interaction is an exciting new step for the NUI world, and one that is already on our doorstep. Soon we will simply be waving at our wearables and snapping our fingers at our smart devices… a bit like magic!