
Protecting privacy in an AI-enabled HR future

By Nick Wilson | 6-minute read

Should employers worry about breaching privacy laws by bringing AI into their HR functions?

HR professionals are showing their willingness to modernise, and HR has as much to gain from early adoption as any other industry. Indeed, research suggests that the majority of HR leaders are already using artificial intelligence (AI) across various HR functions, including:

  1. Employee records management (78 per cent)
  2. Payroll processing and benefits administration (77 per cent)
  3. Recruitment and hiring (73 per cent)
  4. Performance management (72 per cent)
  5. Onboarding new employees (69 per cent)

More importantly, 92 per cent of HR leaders plan to increase their use of AI in at least one area of HR in the coming years. AI has obvious HR applications in areas such as payroll and chatbots; however, it is also being used in more comprehensive, sophisticated ways.


For instance, the growing phenomenon of “stack ranking” uses a statistical approach to compare employees’ performance against one another. After analysing and comparing performance, the AI software recommends certain measures, such as additional training or managerial intervention.
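To make the mechanics concrete (rather than describe any vendor’s actual product), here is a minimal sketch of how such a stack-ranking recommendation might be derived from performance scores. The scores, thresholds and suggested measures are purely illustrative assumptions.

```python
from statistics import mean, stdev

# Illustrative performance scores only; real data would come from an HR system.
scores = {"emp_001": 82, "emp_002": 74, "emp_003": 91, "emp_004": 63, "emp_005": 78}

mu, sigma = mean(scores.values()), stdev(scores.values())

def recommend(z_score: float) -> str:
    # Example thresholds for mapping relative standing to a suggested measure.
    if z_score < -1.0:
        return "managerial intervention"
    if z_score < -0.3:
        return "additional training"
    return "no action"

# Rank employees relative to one another and attach a recommendation.
for emp, score in sorted(scores.items(), key=lambda kv: kv[1]):
    z = (score - mu) / sigma
    print(f"{emp}: score={score}, z={z:+.2f} -> {recommend(z)}")
```

The point of the sketch is simply that the tool ranks people against the group average and then suggests an intervention, which is why the data feeding it matters so much for privacy.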

While layoffs, both real and perceived, continue to make headlines, comparatively little attention has been given to the many privacy concerns involved in bringing AI into HR. Today, we’re asking how business leaders can ensure employee privacy is honoured while rolling AI out across the HR function.

“The critical phrase is trust,” said Charles H Ferguson, Asia-Pacific general manager at AI-empowered, global employment pioneer Globalization Partners. “[In HR], AI is only as good as its ability to deliver on trust and the expectations around data protection.”

“For the purveyor, the question is how do I ensure that I’m providing a transparent and accessible view of the data that I’m using and how I’m utilising it,” said Mr Ferguson.

Workplace privacy is protected by law, and severe penalties often apply for non-compliance. Many details of employment are private in nature, and employers are often legally required to keep certain personal information about employees in their records.

This can include emergency contact details, wage or salary details, leave balances, records of work hours, taxation, banking or superannuation details.

As noted by Mr Ferguson, emergent tech has posed a threat to employee privacy: “These large tech companies know so much about us that privacy is sort of an evolving definition. I don’t know what’s private anymore, quite frankly.”

“But there’s a very human and very real desire to protect what’s private to us, and this should be a first and foremost option [in bringing AI into HR],” he added.

While many countries have enacted data privacy regulations, according to Mr Ferguson, the European Union’s General Data Protection Regulation (GDPR) is the benchmark. Regulation is one part of the equation; the other is responsible, transparent use of AI and clear employer policy.

“The best purveyors of this kind of generative AI technologies are deploying anonymisation techniques that very clearly separate the individuals from the data and the data outcomes,” said Mr Ferguson.
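As a rough illustration of what “separating the individual from the data” can look like in practice, the sketch below pseudonymises employee identifiers before records are handed to an analytics or AI tool. It is a simplified assumption of one possible approach, not a description of any specific vendor’s technique.

```python
import hashlib
import secrets

# The salt acts as the re-identification key; in practice it would be stored
# separately from the analysis data, under stricter access controls.
SALT = secrets.token_hex(16)

def pseudonymise(employee_id: str) -> str:
    """Replace a direct identifier with a salted hash."""
    return hashlib.sha256((SALT + employee_id).encode()).hexdigest()[:12]

# Illustrative records; names and figures are made up.
records = [
    {"employee_id": "jim.h", "department": "accounting", "salary": 95_000},
    {"employee_id": "sue.k", "department": "accounting", "salary": 102_000},
]

# The dataset used for analysis carries only a pseudonym and the fields needed,
# keeping the individual separate from the data and the data outcomes.
analysis_rows = [
    {"pid": pseudonymise(r["employee_id"]), "department": r["department"], "salary": r["salary"]}
    for r in records
]
print(analysis_rows)
```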

“The concern should be: how are you using my data? And what data are you using? And if a company doesn’t have a prepared, authentic, transparent methodology to share that information, then you should be concerned.”

To protect the individual employee’s right to privacy, companies should offer the ability to opt out of AI data use, said Mr Ferguson. At the same time, he stressed that employees should be realistic about what is genuinely personal and what is sufficiently anonymised.

“Let’s take salary data, for example. If I take your salary data and I put it into a pool and it’s associated with a particular role, and that’s put into a market analysis, and it’s anonymised and it comes out with a range between X and Y based on role A and B, this will have no demonstrable impact on your life,” said Mr Ferguson. “But if I say ‘Jim in accounting makes more than Sue in accounting’, that can be a problem, right?”
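Mr Ferguson’s salary example can be expressed as a simple aggregation: individual figures are pooled by role and only a range is reported, with small groups suppressed so no one can be singled out. The roles, figures and minimum group size below are illustrative assumptions only.

```python
from collections import defaultdict

# Made-up salary records: (role, salary)
records = [
    ("Accountant", 88_000), ("Accountant", 95_000), ("Accountant", 102_000),
    ("Analyst", 78_000), ("Analyst", 81_000),
]

MIN_GROUP_SIZE = 3  # suppress ranges drawn from too few people to be anonymous

by_role = defaultdict(list)
for role, salary in records:
    by_role[role].append(salary)

for role, salaries in by_role.items():
    if len(salaries) < MIN_GROUP_SIZE:
        print(f"{role}: range suppressed (fewer than {MIN_GROUP_SIZE} records)")
    else:
        print(f"{role}: {min(salaries):,} - {max(salaries):,} (n={len(salaries)})")
```

Reporting only a range per role, and withholding it when the pool is too small, is what keeps the output from pointing back at “Jim in accounting” or “Sue in accounting”.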

At the core, Mr Ferguson said, AI is not a spirit in the sky. It is a tool to be used transparently and responsibly to supplement – not substitute – human operations.

“It is only as good as the data you feed it. So, if you’re feeding it pre-anonymised, well-designed and scrubbed data, then the interaction you have with it will not infringe anyone’s privacy. The people who are deploying the tool need to have the right intentions and the right design, which only comes through experience,” he concluded.
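As a final illustration of what “scrubbed” data might mean in practice, the sketch below strips common direct identifiers (email addresses and phone-like numbers) from free text before it reaches an AI tool. The patterns are deliberately simple assumptions; production systems would use far more thorough de-identification.

```python
import re

# Very simple patterns for common direct identifiers; illustrative only.
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
PHONE = re.compile(r"\+?\d[\d\s-]{7,}\d")

def scrub(text: str) -> str:
    """Replace obvious personal identifiers before the text is fed to an AI tool."""
    text = EMAIL.sub("[email removed]", text)
    text = PHONE.sub("[phone removed]", text)
    return text

note = "Contact Jim at jim.h@example.com or on +61 400 123 456 about his leave balance."
print(scrub(note))
```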


Nick Wilson

Nick Wilson is a journalist with HR Leader. With a background in environmental law and communications consultancy, Nick has a passion for language and fact-driven storytelling.