As user experience professionals, our job is to create products and services that are useful, usable, and enjoyable. Fortunately, there is a host of research methods at our disposal to point us in the right direction in this pursuit. As we are all keenly aware, though, using the right research method is often not enough. The perfect research plan won't mean much if those executing it are not objective, and, as every user experience professional has experienced at one time or another, objectivity can sometimes be impossible to maintain. This is especially true when it comes to reading human emotion, and matters are made harder still by the fact that what users say they feel is often not an accurate representation of how they actually feel.
In an effort to help user experience professionals objectively and accurately evaluate human emotion, several companies now offer facial expression recognition software—technology that reads human emotion through facial expressions. Such software uses algorithms to analyze people's facial expressions, either offline from recorded video or in real time while a person sits in front of a webcam. Companies that currently offer facial expression recognition software include, but are not limited to, Noldus (see Figure 1), Affectiva, and ThirdSight. While some companies offer the software via subscription and rental models, others sell software licenses. Prices vary greatly depending on the location and size of the purchasing company, and those conducting academic or non-profit research projects are often given discounts.
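While each vendor's implementation is proprietary, a typical pipeline looks roughly the same: locate the face in each frame, then hand the face region to a trained classifier that scores the basic emotions. The sketch below illustrates that flow in Python using OpenCV; classify_emotion is a hypothetical stand-in for whatever model or vendor SDK actually does the classification, not part of any product named above.

```python
import cv2

def classify_emotion(face_img):
    # Hypothetical stand-in for a trained emotion model or a vendor SDK call.
    # A real implementation would return a label such as "happy" or "surprised".
    return "neutral"

# OpenCV's bundled Haar cascade isolates the face region in each frame.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

capture = cv2.VideoCapture(0)  # 0 = default webcam; pass a file path to analyze recorded video

while True:
    ok, frame = capture.read()
    if not ok:
        break  # end of the video file, or the camera is unavailable
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        emotion = classify_emotion(gray[y:y + h, x:x + w])
        print(emotion)

capture.release()
```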
To date, facial expression recognition applications can detect basic emotions such as fear, happiness, sadness, surprise, anger, disgust, and neutrality. Several can also classify a person's head orientation and gaze direction, as well as whether the mouth is open or closed, the eyes are open or shut, and the eyebrows are raised, neutral, or lowered. Some applications, such as Noldus' FaceReader, require no upfront calibration or markers and can analyze multiple videos at once. Several facial expression recognition applications can be integrated into third-party software, and many claim that minimal training is needed before use.
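To make the measures listed above concrete, the snippet below sketches what a frame-by-frame export from such a tool might look like. The field names and values are purely illustrative assumptions, not any vendor's actual schema.

```python
# Illustrative only: the general shape of a frame-by-frame export from an
# expression-recognition tool. Field names and values are hypothetical.
frame_log = [
    {
        "timestamp": 12.4,             # seconds into the session video
        "emotions": {                  # confidence score per basic emotion
            "happy": 0.08, "sad": 0.02, "angry": 0.01,
            "surprised": 0.71, "disgusted": 0.01,
            "scared": 0.03, "neutral": 0.14,
        },
        "head_orientation": {"pitch": -4.0, "yaw": 12.5, "roll": 1.2},  # degrees
        "gaze": "screen",
        "mouth_open": True,
        "eyes_open": True,
        "eyebrows": "raised",
    },
    # ... one entry per analyzed frame
]

# The dominant emotion for a frame is simply the highest-scoring label.
dominant = max(frame_log[0]["emotions"], key=frame_log[0]["emotions"].get)
print(dominant)  # "surprised"
```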
One of the main benefits that such technology offers to UX professionals is its ability to be easily used in combination with established research methods, such as usability testing, to provide a more detailed view of users’ experiences. For example, when testing the new website of a financial institution, you may find that users are able to use the site (good) but are experiencing surprise in response to certain messaging (bad if you are trying to project an image of security and predictability). Without having detected users’ emotional reactions, the company could have prematurely launched the site, potentially resulting in the loss of a lot of money!
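As a rough illustration of how that pairing might work in practice, the sketch below cross-references a hypothetical frame-by-frame emotion log, like the one sketched earlier, with the start and end times of a usability-test task and flags the moments where surprise spikes. The threshold and field names are assumptions, not values from any real tool.

```python
def surprise_moments(frame_log, task_start, task_end, threshold=0.6):
    """Timestamps within a task window where the surprise score exceeds the threshold."""
    return [
        frame["timestamp"]
        for frame in frame_log
        if task_start <= frame["timestamp"] <= task_end
        and frame["emotions"]["surprised"] >= threshold
    ]

# Suppose the "read the transfer confirmation message" task ran from 10 s to 45 s.
flagged = surprise_moments(frame_log, task_start=10.0, task_end=45.0)
print(flagged)  # moments worth replaying with the participant during the debrief
```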
Other benefits of facial expression recognition software include the ability to save time by quickly coding and analyzing video data, the capability to let other applications (such as gaming software) uniquely respond to the emotional state of a user during testing, and the option to detect discrepancies between reported and actual human emotion. Finally, there is the potential to use facial expression recognition technology to create personalized e-commerce and entertainment experiences.
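The second of those benefits, letting another application react to a participant's emotional state, amounts to a simple polling loop. The sketch below assumes a hypothetical detect_current_emotion call in place of a real vendor API; the adaptation rules belong to the test application itself and are only examples.

```python
import random
import time

def detect_current_emotion():
    # Placeholder for a live query to the expression-recognition software.
    # Random labels keep the sketch self-contained and runnable.
    return random.choice(["happy", "neutral", "angry", "surprised"])

def adapt_to_emotion(emotion):
    # The responding application decides what to do; these rules are illustrative.
    if emotion == "angry":
        print("Participant looks frustrated: offer help or ease the task difficulty.")
    elif emotion == "surprised":
        print("Unexpected reaction: mark this moment for review with the participant.")
    else:
        print("No adaptation needed.")

for _ in range(5):          # poll a few times during a session segment
    adapt_to_emotion(detect_current_emotion())
    time.sleep(1)           # roughly once per second
```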
Given that facial expression recognition software is a relatively new technology, there are some limitations to its use. Most applications are sensitive to lighting conditions, face position, and camera distance, and have a hard time reading the expressions of people with facial hair or heavy-framed glasses. Additionally, facial expression recognition applications are less accurate at reading anger, disgust, and the expressions of young children. As the technology improves, some of these issues will likely be resolved.
In summary, facial expression recognition software is an emerging research tool that can help UX professionals systematically capture and objectively analyze human emotions. While the tool's full potential has yet to be fully explored, I have no doubt the technology will be more widely used in the future. Before this happens, though, our community should develop guidelines on how to effectively use facial expressions to augment other usability data and help us make better design and business decisions.