Businesses using artificial intelligence (AI) to monitor employees’ behaviour and productivity can expect those employees to complain more, be less productive and be more inclined to quit, according to new research.
Associate Professor Emily Zitek, from Cornell University in the United States, said the research showed the technology needed to be framed as supporting an employee’s development; otherwise its use would backfire.
Associate Professor Zitek said surveillance tools, which were increasingly being used to track and analyze physical activity, facial expressions, vocal tone, and verbal and written communication, caused people to feel a greater loss of autonomy than human oversight did.
“Businesses and other organizations using the fast-changing technologies to evaluate whether people are slacking off, treating customers well or potentially engaging in cheating or other wrongdoing should consider their unintended consequences, which may prompt resistance and hurt performance,” she said.
Associate Professor Zitek said there was an opportunity to win buy-in if the subjects of surveillance felt the tools were there to assist rather than to judge their performance with assessments they feared would lack context and accuracy.
“When artificial intelligence and other advanced technologies are implemented for developmental purposes, people like that they can learn from it and improve their performance.
“The problem occurs when they feel like an evaluation is happening automatically, straight from the data, and they’re not able to contextualize it in any way.”
Associate Professor Zitek said there were recent examples of backlash from this algorithmic surveillance, with an investment bank in 2020 dropping a pilot program testing productivity software to monitor employee activity, including alerting them if they took too many breaks.
“Schools’ monitoring of virtual tests during the pandemic sparked protests and lawsuits, with students saying they feared any movement would be misinterpreted as cheating,” she said.
Associate Professor Zitek said the research found that people were more accepting of behaviour-tracking systems such as smart badges or smartwatches when those systems provided feedback directly, rather than through someone who might form negative judgments about the data.