Artificial Intelligence is creating fresh legal risks for businesses as new technology opens the way for inadvertent breaches of copyright.
Belinda Breakspear, Partner in the Digital & Intellectual Property team at McCullough Robertson Lawyers (McR), said agentic AI could generate infringing works without any prompts from the user, creating additional risk.
Agentic AI refers to advanced systems that can autonomously plan, reason, and execute complex, multi-step tasks with minimal human intervention.
“Businesses that develop, deploy, or rely on AI and agentic AI systems may face heightened exposure against the current copyright landscape in Australia, which provides an accessible enforcement pathway for rights-holders,” Belinda said in an article released by McR.
Under the Copyright Act 1968 (Cth) (Copyright Act), copyright infringement occurs when a person or business uses all or a substantial part of a copyright work in a way that infringes the copyright owner’s exclusive rights, and does so without permission or a relevant defence (such as the fair dealing exceptions).
To claim copyright infringement, the claimant must be the author or owner of the work in question, and copyright must subsist in that work. A claim must also satisfy the following elements.
- The infringing act has been done in relation to a substantial part of the work.
- When comparing the works, there is “objective similarity”.
- There is a causal connection showing the infringement occurred as a result of copying, whether done deliberately or subconsciously.
There are three main types of copyright infringements.
- Direct infringement – Using all or a substantial part of a work in a way that conflicts with the copyright owner’s exclusive rights
- Indirect infringement – Dealing with an infringing copy without authorisation (for example, importing infringing material into Australia)
- Authorisation infringement – Allowing, encouraging or directing someone else to infringe. Liability sits with the person who facilitated the infringement
Ms Breakspear said there was a “notable lack” of AI-related copyright infringement cases in Australia.
“Nonetheless, both sides of the AI ecosystem, from providers to deployers, could be exposed,” she said.
“For example, a provider (who builds or supplies the AI system) may face liability if they fail to take reasonable steps to prevent infringing outputs.
“A deployer (who uses or integrates the AI system) may face liability if they prompt, generate, store, or rely on infringing outputs. Both may also be liable where copyrighted material is used to train the AI system.”
Ms Breakspear said that ultimately, liability depended on the conduct of each party and how the AI system stored, processed, or reproduced copyrighted material.
“Historically, however, litigation has been minimally pursued due to cost and complexity. A proposed small-claims pathway aims to make it easier for rights-holders to enforce their rights and address lower-value infringement matters, meaning businesses using or supplying AI are more likely to be pursued in the future,” she said.
Serious or commercial-scale infringement can amount to a criminal offence.
This can include making, importing, distributing or possessing infringing AI-generated copies for commercial advantage, or using AI systems to produce or disseminate those copies.
The following are McR’s tips for businesses on managing AI copyright risks.
- Audit your AI tools. Check what your AI systems actually do. Can they store training data, reproduce copyrighted content, or generate material that looks like someone else’s work? Identify where infringement could realistically arise.
- Check who wears the risk. Review your contracts with AI providers. Confirm who is responsible if the system outputs infringing material, and whether you have indemnities or warranties covering training data, outputs, and misuse.
- Use licensed content where needed. If your AI systems (or agentic AI workflows) rely on third-party content, get permission or licences rather than assuming the AI systems’ use is covered.
- Adopt internal rules for how staff should use AI tools. Monitor what training data is fed into your systems and spot-check outputs for potential infringement risk.
- Train your staff. Give employees simple guidance (don’t ask AI to “replicate” existing works, don’t upload copyrighted material without permission, and report any suspicious outputs).
- Keep an eye on reform. With the text and data mining exemption ruled out, the Australian Government has indicated copyright rules in Australia may tighten further. Track updates so your policies and contracts stay up-to-date.
You can contact Belinda Breakspear here.
The full McR article is here.