
Using Performance to Demystify Emotional Recognition Technology


Beverley Hood and her team are using creative methods to support better public understanding of AI and emotional recognition technology.

At EFI, we aim to address the challenges and opportunities posed by data-driven innovation in the Arts, Humanities and Social Sciences. We support cross-disciplinary research and approaches that produce outputs with meaningful impact. It’s All About the Feelings… is a creative project which uses performance to engage audiences in conversations about artificial intelligence (AI) and emotional recognition technology.

What is emotional recognition technology?

Emotional recognition technology is an emerging type of artificial intelligence which tries to identify human emotional states. By tracking subtle changes, such as facial muscle movements and variations in speech, the technology attempts to predict the emotions a user is feeling. In facial expression analysis, different combinations of micro facial movements (for example, eyes widening or nose wrinkling) are mapped, combined and categorised into the emotion they supposedly correspond to. For example, the Affectiva database has been trained on millions of people’s faces and categorises emotions according to a system developed by psychologist Paul Ekman (1978), which proposes a set of universal or basic emotions (fear, anger, joy, sadness, disgust, contempt, and surprise). Emotional recognition systems use such databases to interpret users’ facial expressions and try to identify the corresponding emotion.
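
To make the classification step more concrete, here is a minimal, hypothetical sketch of how combinations of detected facial movements might be matched against Ekman-style basic emotion categories. The movement names and rules below are illustrative assumptions only; real systems such as Affectiva’s use machine learning models trained on large face datasets rather than hand-written rules like these.

```python
# A simplified, hypothetical illustration of mapping combinations of micro
# facial movements to one of Ekman's basic emotion categories. This toy,
# rule-based sketch only shows the idea of the mapping; real emotion
# recognition systems use trained machine learning models, not fixed rules.

# Hypothetical movement labels, loosely inspired by facial "action units"
EMOTION_RULES = {
    "joy":      {"cheek_raiser", "lip_corner_puller"},
    "surprise": {"inner_brow_raiser", "upper_lid_raiser", "jaw_drop"},
    "anger":    {"brow_lowerer", "lid_tightener", "lip_tightener"},
    "disgust":  {"nose_wrinkler", "upper_lip_raiser"},
    "sadness":  {"inner_brow_raiser", "lip_corner_depressor"},
    "fear":     {"inner_brow_raiser", "upper_lid_raiser", "lip_stretcher"},
    "contempt": {"unilateral_lip_corner_puller"},
}

def classify(detected_movements):
    """Return the best-matching basic emotion and a crude 0-1 match score."""
    best_emotion, best_score = "neutral", 0.0
    for emotion, required in EMOTION_RULES.items():
        # Fraction of this emotion's characteristic movements that were detected
        score = len(required & detected_movements) / len(required)
        if score > best_score:
            best_emotion, best_score = emotion, score
    return best_emotion, best_score

if __name__ == "__main__":
    # A frame where the nose wrinkles and the upper lip raises,
    # echoing the article's "nose wrinkling" example
    frame = {"nose_wrinkler", "upper_lip_raiser"}
    print(classify(frame))  # -> ('disgust', 1.0)
```

Even in this toy form, the sketch highlights the assumption the article goes on to question: that a fixed set of facial movements reliably corresponds to a fixed set of emotions, regardless of who is being observed or in what context.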

Emotional recognition technology is a growing industry, and it is being researched for use in various aspects of our lives. For example, by 2024 all new cars in the EU will have an in-built emotion recognition feature used to monitor the driver’s emotions. Emotional recognition technology is also being researched for use in other settings, such as at border control and in schools. For example, the 4 Little Trees software was used on schoolchildren in Hong Kong in 2020, in an attempt to interpret how they were learning remotely during the pandemic.

The controversy surrounding emotional recognition

The science behind emotional recognition technology is not without controversy. There are concerns about how it is used in settings such as schools, where questions arise not only about the reliability of the data, but also about the ethics of informed consent when monitoring children. Other potential use scenarios include border control, where the technology could exacerbate racial profiling and discrimination.

Furthermore, the databases which emotional recognition systems rely on have their own in-built biases and limitations. The way emotions are expressed varies with social and cultural context, and Ekman’s notion of universal basic emotions is not universally accepted. This means that emotional recognition technology may not actually be able to identify emotions expressed across different cultural cohorts or within varying social contexts. In addition, the majority of emotion recognition technologies applied in real-world scenarios rely on a camera to capture users’ facial expressions. This is a quick and readily available method of capture, but it too has limitations. Researchers such as psychologist Lisa Feldman Barrett argue that emotions are embodied, being expressed as responses in the body as well as on the face. This limits the reliability of the facial mapping approach and affects the technology’s ability to read and detect emotional states.

These limitations mean that emotional recognition technology might not be as accurate as is claimed. Although these issues are well known within the AI and ethics research community, they are often less scrutinised in the tech industry and less well known to the general public.

It’s All About the Feelings…

Beverley Hood and her team are using creative methods to help the public better understand AI and emotional recognition technology. To do this, they have developed a performance that aims to provoke critical thinking about how the technology might work in practice and about its implications for our near-future lives.

The performance, created by Beverley Hood, features actor Pauline Goldsmith and Benedetta Catanzariti, a PhD candidate in Science, Technology and Innovation Studies. Audiences watch as Pauline interacts with the AI, using acting methods including improvisation and melodrama to play with the technology, challenging it to identify which emotions she is expressing and manipulating her face to try to construct the basic emotional expressions. Audiences are encouraged to scrutinise how accurately the technology identifies Pauline’s emotions and are invited to share their observations.

At the pilot performance on 3rd June 2022, audiences observed that the technology often failed to identify which emotion Pauline was expressing, for example identifying anger when she was trying to express joy. Audiences also saw that when Pauline was moving, talking, or expressing emotion more subtly, for example performing a range of smiles from happy and passive-aggressive to her ‘Zoom smile’, the technology struggled to give an accurate, meaningful, or nuanced reading.

Why does it matter?

The performance aims to provoke thought about the complexity of human emotions and the challenges and dangers of attempting to use technology to interpret them. Beverley intends the performance to spark conversations about the real-world uses of this type of technology and how it might affect our behaviour in the future.

Using performance as a creative methodology to bridge the gap between research and real-world application helps demystify a technology which many people may have concerns about. It empowers the public to better understand the technology and to scrutinise and question the ways in which it is implemented in their lives. It can also create feedback for developers, prompting them to design technology that aligns with users’ needs and concerns and is attentive to fairness, representation, and bias.

What’s next?

Once the performance is finalised, Beverley Hood and her team hope to take It’s All About the Feelings… on tour to reach a wide range of audiences.

Researcher profile

Beverley Hood is an artist and Reader in Technological Embodiment and Creative Practice. She is leading It’s All About the Feelings…, a creative project which uses performance to engage audiences in conversations about artificial intelligence (AI) and emotion recognition technology. The project is part of The New Real programme on Experiential AI, with additional support from EFI, Tramway, Edinburgh College of Art, the Centre for Data, Culture & Society, Creative Informatics and Cove Park.

Join us to challenge, create, and make change happen.

#ChallengeCreateChange