Within the UX world, more and more technologies are emerging that measure users’ physiology to gain a deeper understanding of their thoughts and emotions. These include measuring heart rate, sweat gland activity and even brainwaves. One example is facial recognition software. Those in the know describe it as the ‘systematic analysis of facial expressions’; to everyone else, it means using a computer to understand someone’s emotions by reading their expressions.
Why facial recognition?
As humans we are rather good at detecting someone’s emotions simply by looking at their face (according to the psychologist Paul Ekman we have six core emotions, although this number is much debated). Charles Darwin argued in his 1872 book that facial expressions of emotion are universal rather than learned differently in each culture, suggesting facial recognition technology could understand a user’s emotions regardless of their gender, creed, ethnicity or upbringing. The obvious question is whether current technology has evolved enough to do this as well as a human can and, if so, whether it can be applied to user research.
How does facial recognition work?
First up, the hardware: nothing special is required here, just a simple webcam to film the user. Next, the software: once filming is underway, an algorithm captures a 3D mesh of the user’s face in a neutral position to calibrate the technology. The software can then track facial motion at specific spatial points, such as the upper lip or the corner of the mouth. Changes to these facial features have been correlated with certain emotions: a pulled lip corner can indicate frustration, for example, and a furrowed brow could indicate anger. If your user encounters a pain point during testing, the changes in their facial expression will be picked up and the software will inform you that the user is experiencing frustration.
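To make the calibrate-then-track idea concrete, here is a minimal, illustrative sketch: store the neutral position of a few tracked points, then label later frames by how far those points have moved. The landmark names, thresholds and emotion mapping are assumptions for illustration only; real products use far richer models than these simple rules.

```python
def calibrate(neutral_frame):
    """Store the neutral (x, y) position of each tracked landmark."""
    return dict(neutral_frame)

def classify(frame, neutral):
    """Label a frame with a coarse emotion based on landmark displacement.

    Coordinates are in pixels with y increasing downwards, so a negative
    dy means the point moved up. Thresholds are arbitrary for this sketch.
    """
    def delta(name):
        x, y = frame[name]
        nx, ny = neutral[name]
        return x - nx, y - ny

    dx_lip, dy_lip = delta("lip_corner")
    _, dy_brow = delta("inner_brow")

    if dy_brow > 3:          # brow pulled down: furrowed, read as anger
        return "anger"
    if dy_lip < -3:          # lip corner raised: smile, read as happiness
        return "happiness"
    if abs(dx_lip) > 3:      # lip corner pulled sideways: frustration
        return "frustration"
    return "neutral"

# Calibrate on a neutral face, then classify a later frame.
neutral = calibrate({"lip_corner": (120, 200), "inner_brow": (100, 120)})
frame = {"lip_corner": (126, 200), "inner_brow": (100, 121)}
print(classify(frame, neutral))  # lip corner pulled sideways -> "frustration"
```

In practice the software tracks dozens of mesh points per frame and combines them statistically, but the principle is the same: emotion labels come from movement relative to the calibrated neutral face.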
When used as part of a moderated usability testing session, you can find out not only when users experience certain emotions but also why. As with eyetracking, the facilitator can ask follow-up questions in response to spikes on the facial recognition software, ensuring you get the most from your tech.
What are the pros and cons of facial recognition?
Pros - There’s little doubt that facial recognition for usability testing has promise: it’s the least intrusive biophysical measure of emotion, it’s relatively cheap with no expensive hardware, and initial tests suggest it is more effective at detecting emotion than other technologies on the market.
Cons - However, it is not a perfect solution. While facial recognition software is fantastic at understanding highly expressive emotions (for example, it will be accurate at detecting your happiness if you’ve won the lottery), the current algorithms used to detect more subtle emotions, such as patience, are less accurate.
Generally speaking, the changes in a user’s facial expression as they navigate a website or app are minor. However, studies suggest facial recognition software is a fairly good predictor of both frustration and enjoyment, two key emotions that are important to capture when testing websites or creating user journey maps.
So, on the face of it (excuse the pun), giving facial recognition technology a go for your next piece of user research could add another level of insight to your outcomes.
- Darwin, C. (1872). The Expression of the Emotions in Man and Animals.
- Ekman, P. & Friesen, W. V. (1971). Constants Across Cultures in the Face and Emotion. Journal of Personality and Social Psychology, 17(2) , 124-129.