12 August 2025
It’s not business as usual: AI, consent, and the future of privacy in Australia
In summary
- Australia’s privacy framework is set for major reform. With an active OAIC, the first tranche of privacy law reforms, a second tranche on the way, and new social media laws all in play, businesses face a complex regulatory landscape that requires careful navigation.
- In news that surprised many industry observers, the Productivity Commission has proposed a dual-track privacy model that would give businesses the option of proving “best interest” data use instead of relying solely on strict consent.
- This creates new expectations for advertisers to justify their data practices, clean up tracking infrastructure, and adopt privacy-enhancing tools such as decentralised clean rooms.
- As compliance bars rise, advertisers should lead with informed consent and data-use transparency, build internal data literacy, and rethink personalisation strategies to stay competitive and compliant.
New privacy rules, AI risk, and agency accountability
Australia’s productivity has been in the news ahead of the Treasury Round Table next week, where Treasurer Jim Chalmers has called for big reform ideas.
“First, ideas should be put forward in the national interest, not through the prism of sectoral, state or vested interests. Second, ideas or packages of ideas should be budget neutral at a minimum but preferably budget positive overall, taking into account the necessary trade‑offs. And third, ideas should be specific and practical, not abstract or unrealistic,” he said in June.
Accordingly, as the Productivity Commission responds to these calls, it risks aligning with major tech companies seeking broader access to Australian consumer data. This mirrors recent backlash in the UK, where investor Marc Andreessen publicly criticised the Online Safety Act’s mandatory age-verification rules. At the same time, platforms such as X (formerly Twitter) have condemned the law as a threat to free speech, warning it borders on censorship.
Last week, just hours before the IAB Data & Privacy Summit kicked off, the Australian Government’s Productivity Commission released its Interim Report: Harnessing Data and Digital Technology to Improve Productivity, presenting a set of proposals for the Government that could fundamentally reshape how data is governed in Australia.
Dual-track regulation is proposed…and it changes everything
As Peter Leonard, principal and director, Data Synergies, put it, “The past is not a guide to the future.” What followed at the event was a pointed discussion on AI, consent, clean rooms, and the shifting definition of “fair and reasonable” data use.
The Productivity Commission’s bombshell proposal? A dual-track privacy regime. As Leonard outlined at the event, organisations may soon get to choose between:
- A stricter consent-based regime (tranche 1.5 - a scaled-down second stage of Privacy Act reforms), or
- An outcomes-based alternative, where organisations justify their use of consumers’ data as being in the best interests of the individual.
This second option is being described as a “safe harbour”, a regulatory pathway where, if you can prove your practices meet certain outcome-based standards, you avoid some of the more prescriptive informed consent requirements.
For advertisers investing in media or utilising data, that may sound promising, but Leonard warned, “One thing to pause and think about is how the tranche 1.5 track might develop if in fact that alternative is made available. And there is always a risk that when the government allows a safety valve or a safe harbour that they kind of double down on the other side and make the other side requirements even more onerous because an organisation has the choice of the safe harbour.”
He also warned that success under either track won’t come from last-minute compliance scrambling; it will require building data literacy across the organisation. That means agencies and businesses need to train internal teams on how data is collected, used, and governed, rather than outsourcing every decision. And it means lawyers themselves need a deeper technical understanding of the adtech and martech systems their advice covers, something they won’t get without education from their clients and agency partners.
What’s more, the Commission is actively examining whether global AI players should be allowed to mine copyrighted content without consent under this same safe harbour logic, raising major questions about data value, ownership, and control in AI development.
The OAIC’s TikTok pixel investigation has serious flow-on effects. It wasn’t just TikTok under the microscope: it was any business that allows third-party trackers on its site. “You can’t say, ‘we didn’t know’ anymore,” said Leonard. “That’s not a defence.”
The ruling was, in Leonard’s words, “a shot across the bow” for website operators. Even if you’re not directly collecting or using the data yourself, you’re still responsible for ensuring any pixel activity on your site complies with the Privacy Act.
Nick Hayes, head of digital at The Media Store, pointed out that many SMEs selling through ecommerce platforms like Etsy or Shopify have no idea what tracking is happening behind the scenes. In some cases, their sites are set up, and even run, by a tech-savvy friend or younger relative. Placing ads through the platform’s CRM or ad manager can automatically embed tracking for impressions and reach, often without the business realising it. The “fair and reasonable” test for such data use remains vague, and what once required a complex technical integration is now as simple as ticking a box, making it far easier for data to start flowing without proper oversight.
Louder’s view: Right now, three actions should be at the top of your list. Start by auditing your measurement tools so you know exactly how they work, what data they collect, and what’s switched on by default (for example, Facebook and TikTok pixels scraping user data). Next, involve your legal teams to ensure these behaviours match your site’s terms and conditions. Finally, be transparent with end users: clearly explain how their data is collected, why it’s collected, and the value they receive in exchange.
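To make that first step concrete, here is a minimal sketch (plain Python, standard library only) of a first-pass tracker audit: fetch a page and list the third-party hosts referenced by its script, image, and iframe tags, flagging any that match an illustrative list of known tracker domains. The domain list and URL are assumptions for illustration, and this only sees tags present in the initial HTML, not tags injected at runtime by a tag manager, so treat it as a starting point rather than a full audit.

```python
# Minimal sketch of a first-pass tracker audit: fetch a page and list the
# third-party hosts referenced by script/img/iframe tags. The tracker
# domains below are illustrative examples only; a real audit should also
# capture tags injected at runtime (e.g. via a tag manager).
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

KNOWN_TRACKERS = {"connect.facebook.net", "analytics.tiktok.com"}  # illustrative

class SrcCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "img", "iframe"):
            src = dict(attrs).get("src") or ""
            host = urlparse(src).netloc
            if host:
                self.hosts.add(host)

def audit(url: str) -> None:
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = SrcCollector()
    parser.feed(html)
    own_host = urlparse(url).netloc
    for host in sorted(parser.hosts - {own_host}):
        flag = "KNOWN TRACKER" if host in KNOWN_TRACKERS else "review"
        print(f"{host}: {flag}")

if __name__ == "__main__":
    audit("https://www.example.com")  # replace with your own site
```

A fuller audit would render the page in a headless browser and log every outbound network request, which is where most pixel activity actually shows up.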
Clean rooms aren’t automatically compliant
Clean rooms have been hyped as a fix-all. But the panel made it clear: not all are equal.
Some still operate as a black box, so brands don’t know where their data goes, how it’s handled, or who sees it. Others require you to hand over your data to a third-party environment instead of keeping it in your own tech stack.
Adele Burke, sales director ANZ at InfoSum, warned: “Just because it’s called a clean room doesn’t mean it’s safe.” She advised brands to check whether differential privacy is built into the tech, and whether the solution is independent and agnostic enough to securely connect to partners.
Danny Tyrell, co-founder of DataCo, added that a clean room doesn’t automatically mean compliance: if outputs contain individual identifiers, they are still personal information under the law. Architecture matters. Decentralised models with strong controls can ensure identifiers never leave without permission, but poor implementation can undermine even the best technology.
Leonard agreed, stressing the issue is often less about whether a clean room is “clean” and more about the form of outputs allowed, and the protections on who receives them. Under privacy law, generating an inference about an individual, even from synthetic data, counts as personal information. In an AI-driven world, he added, the fidelity of data will become as important as privacy.
Louder’s view: Brands must take full ownership of their technology stack, with a clear understanding of the risks of relying on black-box solutions. Encryption alone doesn’t guarantee privacy. What matters is the output: what it is, how it will be used, and what it implies. Just as importantly, understand where your data physically resides and under whose jurisdiction. Data sovereignty can be as critical to compliance and trust as the technology itself.
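As an illustration of “outputs over identifiers”, the sketch below (plain Python, with illustrative thresholds and field names) shows the kind of output control the panel described: only segment-level counts leave the environment, small cohorts are suppressed, and Laplace noise, the basic mechanism behind differential privacy, is added to each count. It demonstrates the principle rather than any particular vendor’s clean room.

```python
# Minimal sketch of output controls in a clean-room-style workflow:
# only aggregates leave the environment, small cohorts are suppressed,
# and Laplace noise (the basic differential-privacy mechanism) is added.
# Thresholds and epsilon are illustrative, not recommendations.
import math
import random
from collections import Counter

MIN_COHORT = 50      # suppress any segment smaller than this
EPSILON = 1.0        # privacy budget for the noise added to each count

def laplace_noise(scale: float) -> float:
    # Sample from a Laplace(0, scale) distribution via the inverse CDF.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def safe_segment_counts(records: list[dict]) -> dict[str, int]:
    """Return noisy counts per segment; never return row-level identifiers."""
    counts = Counter(r["segment"] for r in records)
    released = {}
    for segment, n in counts.items():
        if n < MIN_COHORT:
            continue  # cohort too small to release safely
        released[segment] = max(0, round(n + laplace_noise(1.0 / EPSILON)))
    return released

if __name__ == "__main__":
    # Hypothetical matched audience: note the hashed emails never appear in the output.
    matched = [{"email_hash": f"h{i}", "segment": "sports" if i % 3 else "travel"}
               for i in range(300)]
    print(safe_segment_counts(matched))
```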
Synthetic data and AI are eroding trust in data fidelity
The more AI trains on synthetic data of questionable quality, the greater the chance your outputs may drift from reality. It’s a compounding risk, and one that’s often overlooked in privacy debates.
“AI is starting to retrain on data created by other AI,” said Tyrell. “What happens if that data was a hallucination?”
Leonard echoed the concern, warning that fidelity, not just identifiability, will be a major challenge in future governance models. It’s no longer just about protecting data, it’s about knowing if your data is even real.
Louder’s view: Strong data governance isn’t just about security or compliance, it’s about reliability. Bad data leads to bad decisions, no matter how good your strategy is.
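One practical way to act on this is a routine fidelity check before synthetic data feeds another model. The sketch below (plain Python, with illustrative numbers and an assumed drift threshold) compares a synthetic numeric column against its real counterpart using a two-sample Kolmogorov-Smirnov distance; a growing gap is a signal to investigate before retraining.

```python
# Minimal sketch of a fidelity check before (re)training on synthetic data:
# compare a synthetic numeric column against its real counterpart using a
# two-sample Kolmogorov-Smirnov statistic. The threshold is illustrative;
# in practice you would test every modelled column and track drift over time.
import bisect
import random

def ks_statistic(real: list[float], synthetic: list[float]) -> float:
    """Max gap between the two empirical CDFs (0 = identical, 1 = disjoint)."""
    real_sorted, synth_sorted = sorted(real), sorted(synthetic)

    def ecdf(sorted_values: list[float], x: float) -> float:
        # Fraction of values less than or equal to x.
        return bisect.bisect_right(sorted_values, x) / len(sorted_values)

    pooled = sorted(set(real) | set(synthetic))
    return max(abs(ecdf(real_sorted, x) - ecdf(synth_sorted, x)) for x in pooled)

if __name__ == "__main__":
    random.seed(1)
    real = [random.gauss(100, 15) for _ in range(2000)]       # e.g. real order values
    synthetic = [random.gauss(110, 25) for _ in range(2000)]  # a drifted synthetic copy
    d = ks_statistic(real, synthetic)
    print(f"KS distance: {d:.3f}")
    if d > 0.1:  # illustrative threshold only
        print("Synthetic data has drifted from the real distribution; investigate before retraining.")
```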
Agencies can’t afford to sit on the sidelines
Agencies are now in the hot seat. With overlapping roles in media, martech, analytics, and data strategy, clients are looking to agencies to connect the dots. But there’s a line between helping and overstepping.
“If you want to play Deloitte, you need to take on Deloitte-level liability,” said Tyrell.
That means knowing when to consult, when to collaborate, and when to call in legal. Many independents are setting up tags or advising on data collection without fully grasping the risk.
Louder’s view: We lead with clarity. We’re strategic partners in privacy-aware data practice, but we also know where the legal line is. We encourage clients to engage their legal teams early, help educate them on our data work (including audience targeting), and keep them informed on relevant policy changes. Privacy now sits alongside performance as a core part of our offer.
Louder’s recommendations
- Implement consent and preference management: Ask consumers for permission to exchange their data for access to services, content, convenience and personalisation. Communicate the value exchange openly so people understand it and can make an informed choice (a minimal sketch of a consent record follows this list).
- First-party implementation: Built on client-owned technology, accessed in a compliant manner and with consent, a first-party data strategy can support more precise targeting.
- Audit every measurement tool: Using third-party tech still means first-party responsibility. Collect only the data you truly need, minimise excess, and clearly document its purpose.
- Get privacy fit, fast: Assess whether your data use genuinely benefits consumers; the proposed dual-track model will favour those who can prove it. Even if it never becomes law, it’s a timely reminder to scrutinise the ethics behind how you handle data.
- Educate your employees and legal teams: Keep them across privacy changes so that data collection and usage complies with new rules and aligns with your company policy and T&Cs.
- Don’t trust the “privacy safe” label: Verify architecture, governance, and output controls.
- Question your AI data: Don’t assume your models aren’t biased; test regularly to improve synthetic data quality and make models more neutral and objective.
- Independent agencies: Own your role, not your client’s liability. Support smartly, stay in scope, and define client responsibilities clearly in contracts.
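On the first recommendation, the sketch below (plain Python, hypothetical field and class names) shows what consent and preference management reduces to in code: record each person’s decision per purpose, keep the latest decision, and check it before any use of their data. A production system would also log the notice shown at the time and make withdrawal as easy as opting in.

```python
# Minimal sketch of consent and preference management (hypothetical field
# names): record what each person agreed to, for which purpose, and when,
# and check consent before any use of their data.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str              # e.g. "personalisation", "measurement"
    granted: bool
    recorded_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class ConsentStore:
    def __init__(self):
        # Latest decision per (user, purpose) pair.
        self._latest: dict[tuple[str, str], ConsentRecord] = {}

    def record(self, rec: ConsentRecord) -> None:
        self._latest[(rec.user_id, rec.purpose)] = rec

    def allows(self, user_id: str, purpose: str) -> bool:
        rec = self._latest.get((user_id, purpose))
        return bool(rec and rec.granted)

if __name__ == "__main__":
    store = ConsentStore()
    store.record(ConsentRecord("u123", "personalisation", granted=True))
    store.record(ConsentRecord("u123", "measurement", granted=False))
    if store.allows("u123", "personalisation"):
        print("OK to personalise for u123")
    if not store.allows("u123", "measurement"):
        print("Do not include u123 in measurement audiences")
```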
Get in touch
Get in touch with Louder to discuss how we can assist you or your business, and sign up to our newsletter to receive the latest industry updates straight to your inbox.