Before Steve Jobs revealed the original iPhone in January 2007, few people imagined interacting with a phone without buttons. Now we don’t think twice about it.
Within the UX world, more and more technologies are emerging that measure users’ physiology to gain a deeper understanding of their thoughts and emotions. These include measuring heart rate, sweat gland activity and even brainwaves. One example is facial expression analysis software. Practitioners describe it as the ‘systematic analysis of facial expressions’; in plain terms, it means using a computer to understand someone’s emotions by reading their expressions.
Topics: Usability Testing