
FWC’s guidance on AI use a pragmatic step

By Carlos Tse | March 30, 2026 | 8-minute read

With applications to the Fair Work Commission having increased by more than 70 per cent over three years, its draft guidance notes on managing growing workloads amid rising GenAI use reflect the need for a practical approach, one lawyer says.

The use of AI in Fair Work Commission applications has surged, and in response, the commission has proposed new requirements to clamp down on unmeritorious claims.

The commission estimated that by the end of the 2025–26 financial year, its workload would have increased by more than 70 per cent over three years, an increase it attributed to the growing use of GenAI tools by potential litigants.


On Tuesday (24 March), Fair Work Commission president Adam Hatcher announced the release of draft Guidance Notes: Use of Generative Artificial Intelligence in Commission cases, which set out three new requirements that will apply to the use of GenAI in preparing unfair dismissal and general protections applications lodged with the commission.

The three requirements mandate disclosure of GenAI use in Fair Work Commission applications, introduce obligations to independently check legal information, and require witnesses to affirm the contents of their witness statements.

Further, the commission will require legal practitioners or paid agents preparing an application for a client to include hyperlinks to all case law referred to in the document.

These requirements will be given effect in the Fair Work Commission Rules 2024. Further, new ‘Use of GenAI’ sections will be added to all commission forms, as well as the commission’s outline of submissions and witness statement template documents.

“The requirement to independently check and verify the information generated by AI is critical. We all know GenAI has a tendency to hallucinate, and many practitioners will have stories of authorities that don’t exist cited in submissions where GenAI has played a hand in drafting,” Kingston Reid partner Emily Baxter said.

Baxter noted that the commission’s draft guidance notes acknowledged that GenAI will continue to be used in applications.

“Rather than seeking to (perhaps impossibly) limit or ban that use, the commission provides parameters around that use and disclosure, which will only help with the efficient conduct of any litigation in the Commission,” Baxter said.

“This seems to be a more practical way of dealing with the use of the rapidly expanding technology. Although it’s not going to reduce the influx of claims being experienced by the Commission recently.”

In its draft guidance, the commission recommended that potential litigants not include personal or confidential information in their GenAI prompts, highlighting the risk of breaching suppression orders under the Fair Work Act.

“GenAI collects and retains user data, including conversation history and prompts (short instructions) given to it. Even if conversations are deleted, information given to GenAI may be stored in the training database. This can include documents or photos uploaded to GenAI,” the draft guidance notes said.

AI-fuelled unrealistic optimism

The commission said that GenAI tools can give potential litigants unrealistically optimistic predictions of their prospects of success and likely compensation.

“Particularly where litigants are self-represented, it is helpful to understand whether GenAI has been used to draft applications, evidence or submissions, as this can assist in understanding the context of those documents and give better insight into a litigant’s personal understanding of legal principles that may influence the way practitioners, or the Commission, communicate with the individual,” Baxter said.

The commission warned that the material generated by the tools can be inaccurate, incomplete, out of date, or just made up.

“GenAI does not understand unique fact situations, cultural and emotional factors, or the broader social and legal context of a particular case. The source information that GenAI interprets may not be reliable, and the output may be biased,” the commission said in its guidance notes.

Baxter recalled testing a GenAI tool, which pointed to a number of Fair Work Commission cases to support the answer it gave.

“I asked some further questions to verify those cases, and it provided me with citations and the name of the deciding member. Still not satisfied, I asked again, and it provided me with hyperlinks to the cases on a well-known online legal case database,” Baxter said.

“I clicked the links to go to the source material, only to find that the links (and cases) were hallucinated! The GenAI tool doubled down on its hallucinations a number of times before finally admitting that it was making up the answer.”

Under these requirements, the commission will be able to give less weight to or disregard non-compliant material, dismiss applications, or make costs orders against applicants who do not comply.

The commission is also inviting comment on whether all applicants using GenAI, not only legal practitioners and paid agents, should be required to include hyperlinks to any case law referred to in their applications.

“Including hyperlinks to case law can assist the person using GenAI, other parties and the Commission, to confirm the case law exists and to obtain access to it,” the commission said in a statement.

“At present, this requirement is confined to legal practitioners and paid agents, because of a concern that unrepresented individuals may have difficulty including hyperlinks in documents.”

Comments can be submitted to the FWC by 10 April 2026.

Carlos Tse

Carlos Tse is a graduate journalist writing for Accountants Daily, HR Leader and Lawyers Weekly.
