08 August 2025

It’s not business as usual: AI, consent, and the future of data privacy in Australia

In summary

  • Australia’s privacy framework is set for major reform, with a proposed dual-track model giving businesses the option to prove “best interest” data use instead of relying solely on strict consent.
  • This move puts pressure on advertisers to justify their data practices, clean up tracking infrastructure, and adopt privacy-enhancing tools like decentralised clean rooms.
  • As the compliance bar rises, advertisers should be leading with transparency, building internal data literacy, and rethinking personalisation strategies to stay competitive and compliant.

New privacy rules, AI risk, and agency accountability

Just hours before the IAB Data & Privacy Summit kicked off, the Productivity Commission released its Interim Report: Harnessing Data and Digital Technology to Improve Productivity, a report whose proposals could fundamentally reshape how data is governed in Australia.

As Peter Leonard, principal and director of Data Synergies, put it, “The past is not a guide to the future.” What followed was a pointed discussion on AI, consent, clean rooms, and the shifting definition of “fair and reasonable” data use.

Dual-track regulation is coming…and it changes everything

The Productivity Commission’s bombshell proposal? A dual-track privacy regime. As Leonard outlined at the event, organisations may soon get to choose between:

  • A stricter consent-based regime (tranche 1.5, a scaled-down second stage of Privacy Act reforms), or
  • An outcomes-based alternative, where you justify your data use as being in the best interests of the individual.

This second option is being described as a safe harbour, a regulatory pathway where, if you can prove your practices meet certain outcome-based standards, you avoid some of the more prescriptive consent requirements.

For digital advertisers, that sounds promising. But Leonard cautioned, “One thing to pause and think about is how the tranche 1.5 track might develop if in fact that alternative is made available. And there is always a risk that when the government allows a safety valve or a safe harbour that they kind of double down on the other side and make the other side requirements even more onerous because an organisation has the choice of the safe harbour.”

He also warned that success under either track won’t come from last-minute compliance scrambling; it will require building data literacy across the organisation. That means agencies and businesses need to train internal teams on how data is collected, used, and governed, rather than outsourcing every decision. And it means lawyers themselves need a deeper technical understanding of the adtech and martech systems their advice covers, something they won’t get without education from their clients and agency partners.

What’s more, the Commission is actively examining whether global AI players should be allowed to mine copyrighted content without consent under this same safe harbour logic, raising major questions about data value, ownership, and control in AI development.

Pixels and third-party trackers: ignorance is no longer a defence

The OAIC’s TikTok pixel investigation has serious flow-on effects. It wasn’t just TikTok under the microscope; it was any business that allows third-party trackers on its site.

“You can’t say, ‘we didn’t know’ anymore,” said Leonard. “That’s not a defence.”

The ruling was, in Leonard’s words, a shot across the bow for website operators. Even if you’re not directly collecting or using the data yourself, you’re still responsible for ensuring any pixel activity on your site complies with the Privacy Act.

Nick Hayes, head of digital at The Media Store, added that many SMEs don’t even realise pixels are firing; sometimes their tags are managed informally by a friend or relative, and the “fair and reasonable” test remains vague. What used to require a complex technical integration is now often just a checkbox in a platform UI, making it far easier for data to start flowing without proper oversight.

Between that risk profile, the compliance obligations, and the technology’s ease of deployment, the bar for claiming ignorance has been removed.

Louder’s view: This is the moment to audit your pixels, get clear on your tag governance, and ensure every tracking tool has a documented legal basis.
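A practical first pass doesn’t have to be complicated: fetch your own pages and flag any scripts or resources that call known tracker domains. The sketch below is illustrative only; the tracker list is a small sample we’ve chosen for the example, and tags injected at runtime by a tag manager won’t show up in static HTML, so a full audit still needs a headless browser or your tag management export.

```python
# Minimal sketch: flag third-party tracking tags visible in a page's static HTML.
# The tracker domain list is a small illustrative sample, not an exhaustive registry.
# Tags injected at runtime (e.g. via a tag manager) need a headless browser to detect.
import re
from urllib.parse import urlparse
from urllib.request import urlopen

KNOWN_TRACKER_DOMAINS = {
    "analytics.tiktok.com",      # TikTok pixel
    "connect.facebook.net",      # Meta pixel
    "www.googletagmanager.com",  # Google Tag Manager container
}

def audit_page(url: str) -> list[str]:
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    # Pull every src/href URL out of the markup.
    resources = re.findall(r'(?:src|href)=["\'](https?://[^"\']+)["\']', html)
    findings = []
    for resource in resources:
        domain = urlparse(resource).netloc.lower()
        if any(domain == t or domain.endswith("." + t) for t in KNOWN_TRACKER_DOMAINS):
            findings.append(resource)
    return findings

if __name__ == "__main__":
    for hit in audit_page("https://example.com"):
        print("Review legal basis for:", hit)
```

Every hit should map back to an entry in your tag register: who owns it, what data it collects, and the documented legal basis for that collection.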

Clean rooms aren’t automatically compliant

Clean rooms have been hyped as a fix-all. But the panel made it clear: not all are equal.

Some still operate as a black box, so brands don’t know where their data goes, how it’s handled, or who sees it. Others require you to hand over your data to a third-party environment instead of keeping it in your own tech stack.

Adele Burke, sales director ANZ at InfoSum, warned: “Just because it’s called a clean room doesn’t mean it’s safe.” She advised brands to check whether differential privacy is built into the tech, and whether the solution is independent and agnostic enough to connect securely to partners.
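For context, “differential privacy” in this setting usually means the clean room adds calibrated statistical noise to aggregate outputs, so no single individual’s presence can be inferred from a result. Here is a minimal sketch of the underlying idea, the standard Laplace mechanism; the epsilon value and query are illustrative and not drawn from any specific vendor’s implementation.

```python
# Toy illustration of the Laplace mechanism behind many differential privacy claims.
# Real clean rooms also manage privacy budgets, sensitivity analysis, and
# suppression of small cohorts; this only shows the core noise-adding step.
import numpy as np

def noisy_count(true_count: int, epsilon: float = 1.0, sensitivity: float = 1.0) -> float:
    """Return the true count plus Laplace noise with scale = sensitivity / epsilon."""
    scale = sensitivity / epsilon
    return true_count + np.random.laplace(loc=0.0, scale=scale)

# e.g. an audience overlap count reported with noise; smaller epsilon = more noise = stronger privacy
print(noisy_count(12_408, epsilon=0.5))
```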

Danny Tyrell, co-founder of DataCo, added that a clean room doesn’t automatically mean compliance: if outputs contain individual identifiers, they are still personal information under the law. Architecture matters. Decentralised models with strong controls can ensure identifiers never leave without permission, but poor implementation can undermine even the best technology.

Leonard agreed, stressing that the issue is often less about whether a clean room is “clean” and more about the form of outputs allowed and the protections around who receives them. Under privacy law, generating an inference about an individual, even from synthetic data, counts as personal information. In an AI-driven world, he added, the fidelity of data will become as important as privacy.
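One way to picture “controls on the form of outputs” is a gate that every query result must pass before it leaves the environment: no identifier-level columns, and no cohorts small enough to point at an individual. The sketch below is hypothetical; the column names and minimum cohort size are invented for illustration, and real systems layer rules like these with noise, audit trails, and contractual controls.

```python
# Hypothetical output gate for a clean-room style environment.
# Column names and the minimum cohort size are invented for illustration only.
MIN_COHORT_SIZE = 50
IDENTIFIER_COLUMNS = {"email", "phone", "device_id", "hashed_email"}

def approve_output(rows: list[dict]) -> bool:
    """Allow a result set to leave only if it is aggregate and identifier-free."""
    for row in rows:
        if IDENTIFIER_COLUMNS & set(row.keys()):
            return False  # identifier-level output is still personal information
        if row.get("audience_size", 0) < MIN_COHORT_SIZE:
            return False  # small cohorts can single out individuals
    return True

# An aggregate overlap report passes; a row-level export does not.
print(approve_output([{"segment": "sports", "audience_size": 4200}]))    # True
print(approve_output([{"email": "user@example.com", "clicked": True}]))  # False
```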

Louder’s view: Choose decentralised clean room solutions with privacy baked in. Ask: who controls the data, what outputs are allowed, and how are inferences handled? Privacy risk sits in the details.

Synthetic data and AI are eroding trust in data fidelity

The more AI trains on synthetic data, the more distorted the outputs become. It’s a compounding risk, and one that’s often overlooked in privacy debates.

“AI is starting to retrain on data created by other AI,” said Tyrell. “What happens if that data was a hallucination?”

Leonard echoed the concern, warning that fidelity, not just identifiability, will be a major challenge in future governance models. It’s no longer just about protecting data; it’s about knowing whether your data is even real.
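To make the compounding effect concrete, here is a toy simulation (ours, not the panel’s) of the widely discussed “model collapse” pattern: fit a simple distribution to data, generate synthetic samples from the fit, refit on those samples, and repeat. With nothing anchoring each generation back to real data, the fitted statistics tend to drift, and over many generations the spread usually shrinks.

```python
# Toy illustration of compounding distortion when models retrain on their own outputs.
# A Gaussian stands in for "a model"; each generation is fitted only to synthetic
# samples drawn from the previous generation's fit, with no fresh real data.
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: the "real" data.
data = rng.normal(loc=100.0, scale=15.0, size=50)

for generation in range(1, 201):
    mean, std = data.mean(), data.std()
    # The next generation trains purely on synthetic samples from the current fit.
    data = rng.normal(loc=mean, scale=std, size=50)
    if generation % 25 == 0:
        print(f"generation {generation:3d}: mean={mean:7.2f}  std={std:6.2f}")
```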

Louder’s view: Strong data governance isn’t just about security or compliance. It’s about reliability. You can’t make sound marketing decisions off bad data.

Agencies can’t afford to sit on the sidelines

Agencies are now in the hot seat. With overlapping roles in media, martech, analytics, and data strategy, clients are looking to agencies to connect the dots. But there’s a line between helping and overstepping.

“If you want to play Deloitte, you need to take on Deloitte-level liability,” said Tyrell.

That means knowing when to consult, when to collaborate, and when to call in legal. Many independents are setting up tags or advising on data collection without fully grasping the risk.

Louder’s view: We lead with clarity. We’re strategic partners in privacy-aware data practice, but we also know where the legal line is. Privacy now sits alongside performance as a core part of our offer.

Louder’s recommendations

  • Get privacy fit, fast: The dual-track model will favour those who can prove consumer benefit. Start preparing now.
  • Audit every pixel: Third-party tech = first-party responsibility.
  • Don’t trust the clean room label: Verify architecture, governance, and output controls.
  • Question your AI data: Synthetic does not equal accuracy. Don’t assume your models are clean.
  • Agencies: own your role, not your client’s liability: Support smartly, stay in scope.

Get in touch

Get in touch with Louder to discuss how we can assist you or your business, and sign up to our newsletter to receive the latest industry updates straight to your inbox.



About Emma Shepherd

Emma is the Editorial and Communications Lead at Louder. When she’s not writing or editing, she’s out walking her dog, Bronx, taking a Pilates class, or tracking down the city’s best Sunday roast.