
AI surveillance in the workplace: A good or bad idea?

By Carlos Tse | 7 minute read

Here, one expert reflects on HR leaders' use of AI-powered surveillance, its benefits and risks, and how it can be approached safely to make possible what was once too labour-intensive to achieve.

Leading AI expert Professor Jian Yang of Macquarie University's Data Horizons Research Centre recently shared her insights on the use of AI surveillance, saying: “AI can protect privacy or destroy it.”

Yang described AI surveillance as a double-edged sword. She said: “AI education is vital, both in realising its potential for good and [understanding] the ethical implications of its use.”


Along with the educational aspect, Yang stressed the importance of regulation to curb the negative impacts that these technologies can have on an individual’s privacy.

“The data that could be defined or inferred by AI-powered surveillance is almost limitless, making strong and specific legislation essential,” she said.

Our data, our choice

When these technologies are used on work-from-home (WFH) days, Yang said, the implications can extend as far as profiling family members and collecting aggregated data to sell.

She said: “Employees’ data holds value. For example, it’s possible that employers could scrape surveillance data at scale and create aggregated data sets on everything, from when employees eat lunch to when they argue with family members.”

Speaking about the use of AI to analyse devices that capture 24/7 audio, Yang warned: “Even with limited audio range, companies can detect far more than just work-related conversations. Using AI, they can infer household routines, emotional states, and even family dynamics.”

AI is indiscriminate in its analysis, she observed. When used in conjunction with these audio devices, “it’s not just listening for work; it’s listening for patterns, tone, and context. That’s a powerful tool, but also a dangerous one if misused.”

She said: “The sound of a child playing or a partner speaking in the background can reveal age, gender, and daily schedules, which could be used to build a profile of the employee’s family, without their knowledge or consent.”

“These data sets can then be used directly or sold to third parties without any benefits to employees.”

With powerful analytical capabilities comes great responsibility

Yang remarked that AI has a multitude of benefits for care and wellbeing in the workplace, especially in the aged care sector.

“In aged care, there is the potential to use sensors to alert carers to trips and falls, or to flag unusual patterns, like if someone is in bed or the bathroom for a long time and may need help,” she said.

With employees’ knowledge, Yang said, AI surveillance allows HR leaders to better support staff, such as team members who may be working excessive hours or not taking breaks.

She said: “Current technology is making possible what was once too labour-intensive to achieve. Organisations don’t have time to trawl through the volume of data captured by surveillance, but AI can review multiple sources and create alerts and trend analyses in real time.”
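To make that idea concrete, the short Python sketch below shows the kind of automated review-and-alert step Yang describes: it combines two data sources and flags staff who appear to be working excessive hours or skipping breaks. The data sources, field names and thresholds are illustrative assumptions for this article, not Yang’s research or any vendor’s product logic.

from datetime import datetime, timedelta

# Assumed wellbeing thresholds for the example only.
MAX_DAILY_HOURS = 10
MIN_BREAK = timedelta(minutes=30)

def daily_alerts(sessions, breaks):
    """Flag excessive hours or missing breaks.

    sessions: {person: [(start, end), ...]} of active working time
    breaks:   {person: [(start, end), ...]} of recorded break time
    """
    alerts = []
    for person, spans in sessions.items():
        worked = sum(((end - start) for start, end in spans), timedelta())
        rested = sum(((end - start) for start, end in breaks.get(person, [])), timedelta())
        if worked > timedelta(hours=MAX_DAILY_HOURS):
            alerts.append(f"{person}: {worked} logged today, above the {MAX_DAILY_HOURS}-hour threshold")
        if spans and rested < MIN_BREAK:
            alerts.append(f"{person}: less than {MIN_BREAK} of break time recorded")
    return alerts

# Example run with made-up figures: an 11.5-hour day with a 15-minute break raises both alerts.
day = lambda h, m=0: datetime(2025, 1, 1, h, m)
print(daily_alerts(
    sessions={"A. Worker": [(day(8), day(19, 30))]},
    breaks={"A. Worker": [(day(12), day(12, 15))]},
))

The point of the sketch is the shape of the processing, not the numbers: running such checks continuously across many records is exactly the once-too-labour-intensive task Yang says AI now automates.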

The ethical panopticon

Yang urged HR leaders who utilise AI surveillance to be driven by purpose, not curiosity. She said: “Businesses must ask themselves: do they need to gather data on when someone eats lunch, or what music they play while working?”

AI itself can help enforce ethical data management, she noted, by extracting and monitoring only the data that is required.

“We can already monitor keystrokes and other indicators to see when people are working. There are also options like sensors that collect limited data that [don’t] infringe on privacy rights,” she said.

Yang said that with proper AI education, these technologies can de-identify and aggregate data to help people live independently.

“Aggregated data can be valuable, but individual privacy must never be compromised,” Yang said.
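For readers curious what “de-identify and aggregate” can look like in practice, the Python sketch below is a minimal illustration under assumptions made for this article (the salt, field names and group-size threshold are hypothetical, and this is not a description of Yang’s work or any specific product): direct identifiers are replaced with salted one-way hashes, and only group-level averages are reported, with small groups suppressed because they are easy to re-identify.

import hashlib
from collections import defaultdict

SALT = "rotate-this-secret"  # hypothetical; in practice the salt would live in a secrets store

def de_identify(employee_id):
    # Replace a direct identifier with a salted one-way hash (a pseudonym, not full anonymity).
    return hashlib.sha256((SALT + employee_id).encode()).hexdigest()[:12]

def aggregate_by_team(records, min_group_size=5):
    # Report only team-level averages, and suppress teams too small to stay anonymous.
    teams = defaultdict(list)
    for rec in records:
        teams[rec["team"]].append(rec["hours_active"])
    return {
        team: round(sum(hours) / len(hours), 1)
        for team, hours in teams.items()
        if len(hours) >= min_group_size
    }

# Raw records are pseudonymised before storage, and only aggregates ever leave the system.
raw = [
    {"employee_id": "e101", "team": "claims", "hours_active": 9.5},
    {"employee_id": "e102", "team": "claims", "hours_active": 7.0},
]
pseudonymised = [{**r, "employee_id": de_identify(r["employee_id"])} for r in raw]
print(aggregate_by_team(pseudonymised, min_group_size=2))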