Emotional AI: Do we need ethical guidelines?

Andrew McStay, Bangor University’s School of Creative Studies and Media

Emotions matter. Joy, fear, surprise, anger, desire, anxiety, calm and other emotional states colour our world and profoundly alter behaviour. I ask: is it OK to use technology to understand people’s emotions, what ends are acceptable, and do we need ethical guidelines?

For the uninitiated, two questions might come to mind: first, “Isn’t this a bit creepy?” and then, “How does this work anyhow?”

On how

Emotion detection can be done by tracking activity on social media, the tone of our voices and our physiology; facial expressions are key (try a big smile and feel how the whole face changes shape – it may even lift your mood a little!).

Other physiological measures include heart rate, breathing, and electrical activity in the brain and skin (think ‘Meet the Parents’ and the scene with the lie detector test). This might sound far-fetched, but it is routine for audience research companies such as Millward Brown to use technology and neuroscience to analyse bodies and brains for reactions to advertising.

What is novel is what is taking place outside the lab, especially the tracking of emotions by sensors in our devices and environments. Affectiva, for example, likes to speak not just of an Internet of Things (IoT), but of a world where devices armed with emotion chips respond to us like empathetic friends. The big picture is one where artificial emotional intelligence permeates homes, devices and media. This makes human-technology interactions more natural, but privacy questions should rightly be appearing in your mind about now.

There is scope for abuse, but is there anything innately wrong with emotion detection? Gaming, for example, makes excellent use of facial expressions and other soft biometrics to enhance in-world gameplay (see Nevermind).

Likewise, Sensum, in addition to their marketing work, have used biometric data about emotions to create interactive horror films. Both use full-blown opt-in consent (and nothing personal is saved or stored). There are also significant applications of this technology to promote psychological wellbeing and learning.

Boundaries are now being pushed, and emotion-sensitive technology is being used in public spaces, especially in the UK. Retail is taking an interest and Adland is making creative uses of it. In 2015, for example, M&C Saatchi partnered with Clear Channel and Posterscope to produce a landmark campaign that evolved unique ads based on people’s facial reactions to them. Likewise, Mindshare and Kinetic used emotion-sensing technologies at the 2015 Wimbledon tennis championships, tracking the crowd’s faces, voices and even heart rates (via wrist-based wearables) to enhance Jaguar’s out-of-home and online presence. Similarly, Mindshare worked with Sensum and Unilever to create an interactive race in which runners were equipped with sensors to track and live-share their energy and emotion.

The net result is this: automated tracking of emotional behaviour is definitely ‘a thing’ and it is growing.

Is it creepy?

To an extent, this is a matter of perspective. My own research finds that older people are not keen on emotion tracking in any form, while younger people are less uncomfortable with it. However, both groups (and the generations in between) are really not happy if data about emotions is linked with personally identifiable information. This is important.

Is it ethical? Well, that depends. In my view, it comes down to the old adage ‘Don’t take without asking’. What is clear is that working with emotions is sensitive: even if the data is not technically personal, it is intimate. Conversations about the ethics of emotion detection have so far been non-existent, so yes, if this sector is to grow healthily and gain people’s trust, we do need ethical guidelines.

The next question is: what might these look like?

This is where you come in.

To learn more about this sector, and to network, debate and contribute to its ethical guidelines, you are invited to an upcoming workshop with Dr. Andrew McStay and co-chair Gawain Morrison, CEO and co-founder of emotional marketing and technology consultancy Sensum. It will take place at London’s Digital Catapult on Friday 16 September (1:30–5pm) and, although chaired externally, is a PD&TN-supported event.

Register now! Don’t miss the opportunity to help identify the ethical standards that will drive the industry forward.

Andrew McStay is a reader in advertising and digital media. He is also director of the Media and Persuasive Communication network (MPC) at Bangor University, UK. He can be reached at mcstay@bangor.ac.uk or @digi_ad.

Gawain Morrison is CEO and co-founder of Sensum, a leading UK emotional marketing and technology consultancy. He believes passionately in empowering individuals to harness the power of their emotions for their own benefit and that of wider society. He can be reached at gawain@sensum.co.