
Should bosses use AI to track employees’ emotions?

By Kace O'Neill | 5 minute read

“Big Brother” at work: Have employers gone too far with their new tool? Emotional AI tracking is now being embedded in the workplace, taking away the cherished privacy of one’s own emotions.

Since the COVID-19 era, emotional AI tracking has been embedded in workplaces. Using biometric signals picked up from facial expressions, vocal tone, and other physiological reactions, these tools monitor human behaviour and use AI to infer an employee's emotional state.
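For a sense of how such systems work at a mechanical level, the sketch below shows the general signals-in, label-out pattern in Python. It is purely illustrative: every feature name, threshold, and label is hypothetical, and commercial products use trained machine-learning models over video and audio rather than hand-written rules like these.

```python
# Illustrative sketch only: a toy version of the signals-in, label-out
# pattern emotional AI tools follow. All features, thresholds, and labels
# are hypothetical; real systems use models trained on labelled recordings.
from dataclasses import dataclass


@dataclass
class Signals:
    """Hypothetical features extracted from a webcam and microphone feed."""
    brow_furrow: float      # 0.0-1.0, from facial-expression analysis
    smile: float            # 0.0-1.0, from facial-expression analysis
    vocal_pitch_var: float  # 0.0-1.0, variability in vocal tone
    speech_rate: float      # words per second


def infer_emotional_state(s: Signals) -> str:
    """Map extracted signals to a coarse emotion label.

    A rule-based stand-in for the classifier a real product would use;
    it exists only to show the shape of the pipeline.
    """
    if s.brow_furrow > 0.7 and s.vocal_pitch_var > 0.6:
        return "stressed"
    if s.smile > 0.6 and s.speech_rate > 2.0:
        return "engaged"
    if s.smile < 0.2 and s.speech_rate < 1.0:
        return "disengaged"
    return "neutral"


if __name__ == "__main__":
    # One simulated reading from a monitoring device.
    reading = Signals(brow_furrow=0.8, smile=0.1,
                      vocal_pitch_var=0.7, speech_rate=1.4)
    print(infer_emotional_state(reading))  # -> "stressed"
```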

This is not, however, a new phenomenon. Back in 2019, Gartner reported that over 50 per cent of large employers in the US were using emotional AI to keep track of the internal states of their employees. Phrased that way, it reads like a gross and invasive breach of privacy, yet organisations have claimed the practice is aimed at improving workers' wellbeing.


HR Leader reached out to EST10 founder Roxanne Calder – who has previously been frank with her thoughts on wellbeing policies – for her view on the implementation of emotional AI tracking in the Australian workplace.

“Emotional AI has the potential to provide actionable insights into employee engagement and mental health. It is possible to identify patterns of stress or disengagement that lead organisations to intervene early and offer support or resources to employees,” said Calder.

“Tools like these can, in theory, democratise access to mental healthcare, providing real-time feedback that would otherwise be difficult to gauge in large teams.

“Yet, I’m not entirely convinced. There is the issue of personal privacy – ‘Why can’t I have a bad day in privacy … without it being brought to the attention of the world?’ With global low productivity levels and reduced engagement levels in many organisations using such AI, emotional tracking could erode trust within organisational cultures.”

Despite the benefits, implementing this technology means navigating a complex interplay between rapidly advancing capability and the privacy of our emotions.

“Emotional AI presents a complex interplay between technological advancement (necessary) and the preservation of human dignity (vital). Our wellness and wellbeing are, of course, important. Yet, my question as always remains, why is the shift in responsibility from the individual to organisations, and with emotional AI tracking, have we gone too far?” said Calder.

“By relying on tech to tell us how we are feeling, will it reduce our ability to recognise and regulate emotions and feelings?

“Worse, for someone to reach out and say, ‘R U OK’ will be determined by AI telling us to – instead of our human emotional senses. Humans need humans, and these precious human skills need practice.”

Based on this, Calder argues that it is imperative for technology to work for humans, and not vice versa.

“Technology should serve to enhance human capabilities without undermining fundamental human values. We should advocate for the ethical application of technology, ensuring it supports rather than [supplants] human connection,” said Calder.

“Within this context, it is clear that emotional AI can offer important insights, [but] it is also clear that it remains only as a tool that cannot replicate the depth of human empathy and the nuances of interpersonal relationships.

“While recognising the potential benefits of technology in monitoring and promoting wellbeing, it is absolutely crucial to approach its implementation with caution. Prioritising transparent communication, robust ethical guidelines, and the irreplaceable value of genuine human interaction is essential to ensure that such tools truly serve the collective wellbeing of employees.”

As the technology develops at pace, a pivotal decision must be made about where to draw the line. Emotional tracking is only one of the many ways AI can and will be applied in the workplace; if its implementation strays beyond ethical boundaries, the core fabric of the workplace could be torn apart.

“At its core, this technology attempts to quantify something deeply human: our emotions. The question isn’t just whether emotional tracking goes ‘too far’ but whether it shifts the workplace away from fostering genuine wellbeing towards technocratic management of feelings. Please, no!” said Calder.

Kace O'Neill

Kace O'Neill is a Graduate Journalist for HR Leader. Kace studied Media Communications and Māori Studies at the University of Otago and has a passion for sports and storytelling.