29 April 2026

What AI still can’t fix in paid media

In summary

  • AI is improving performance, but it’s only as good as the signals it’s fed
  • Most performance issues aren’t algorithm problems, they’re data problems
  • Platforms optimise for what they can see, not what actually drives business results
  • The edge now sits in measurement, judgment and cross-channel decisions
  • Teams treating AI as autopilot are already drifting off course

Automation is working. That’s not the full story.

The last two years in paid media have been genuinely impressive.

Performance Max. Advantage+. Smart Bidding reacting faster than any human team could. Creative that used to take a week now turned around in hours.

And on paper, it works. Reported ROAS is up. Campaign structures are simpler.

But the best-performing accounts still have people doing very deliberate work.

Not because they don’t trust the automation, but because they know where it stops being useful.

AI is doing what you told it to do

Performance Max doesn’t know your margin. It doesn’t know which leads actually convert. It doesn’t know which products lose money.

It knows the signal you send it.

So if you optimise to revenue, it will chase revenue, even if it’s unprofitable.
If you optimise to leads, it will find more leads, including the ones that never close.

Nothing is broken. The system is working exactly as designed.

The issue is the input.

And that’s where most underperformance sits: weak, noisy or misaligned conversion signals.

Fixing it isn’t glamorous: CRM integration, lead scoring, value rules, exclusions. But it’s the highest-leverage work in paid media right now, and AI won’t do it for you.
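To make the value-rules point concrete, here is a minimal sketch of sending margin-adjusted conversion values instead of raw revenue, so bidding optimises toward profit rather than turnover. The product names and margins are invented for illustration; they are not from any real account.

```python
# Sketch: report margin-adjusted conversion values instead of raw revenue,
# so the bidding algorithm chases profit, not turnover.
# SKUs and margins below are illustrative assumptions.

MARGINS = {
    "hero_product": 0.55,   # healthy margin: worth bidding for
    "loss_leader": -0.05,   # sold below cost: its revenue is a misleading signal
    "accessory": 0.30,
}

def adjusted_value(sku: str, revenue: float) -> float:
    """Return the profit-based value to report as the conversion value."""
    margin = MARGINS.get(sku, 0.25)     # fallback margin for unknown SKUs
    return max(revenue * margin, 0.0)   # never report a negative value

order = [("hero_product", 120.0), ("loss_leader", 80.0)]
total = sum(adjusted_value(sku, rev) for sku, rev in order)
# Raw revenue for this order is 200.0, but the signal sent is 66.0:
# the bidder now "sees" the loss leader as near-worthless
# instead of as half the order.
print(total)  # → 66.0
```

The same idea applies to leads: score them in the CRM and feed the score back as the conversion value, so "more leads" stops meaning "more unqualified leads".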

Where the gaps still are

Measurement

Every platform reports its own version of performance.

Different attribution. Different lookback windows. Different incentives to claim credit.

Stack them together and you often end up with more “performance” than actual revenue.

Without a measurement layer above the platforms (incrementality testing, MMM, or at minimum a single source of truth), decisions are being made on inflated numbers.
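A toy illustration of why stacked in-platform reports overstate reality, using entirely made-up numbers: each platform claims credit under its own attribution, so the sum of their "revenue" exceeds what the business actually booked.

```python
# Toy numbers: each platform attributes many of the same orders to itself,
# so summing in-platform "revenue" over-counts actual revenue.
platform_reported = {"search": 140_000, "social": 95_000, "display": 40_000}
actual_revenue = 180_000  # single source of truth, e.g. the order system

claimed = sum(platform_reported.values())   # 275,000 of claimed "performance"
overclaim = claimed / actual_revenue - 1    # ~53% more than actually happened
print(f"claimed {claimed:,} vs actual {actual_revenue:,} ({overclaim:.0%} over)")
```

The gap between `claimed` and `actual_revenue` is exactly the space a measurement layer exists to close.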

Incrementality

AI is good at capturing conversions.

It’s not designed to tell you if they would have happened anyway.

Branded search, retargeting, lower-funnel activity: these look efficient because they are.

But they also inflate performance.

The question that matters still sits entirely outside the platform: would this have happened without the ad?
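One common way to answer it is a geo holdout. As a minimal sketch with invented figures: compare conversions in matched regions where ads ran against regions where they were paused, and treat only the difference as incremental.

```python
# Sketch of a geo-holdout read-out with invented figures: the lift is the
# gap between exposed and held-out regions, not the platform's
# attributed total.
exposed_conversions = 1_150    # matched regions where ads ran
holdout_conversions = 1_000    # matched regions with ads paused
platform_attributed = 600      # what the platform claims it drove

incremental = exposed_conversions - holdout_conversions
print(f"incremental: {incremental}, platform-attributed: {platform_attributed}")
# The platform credits itself with 600 conversions, but only 150
# would not have happened anyway.
```

A real test needs matched markets, a long enough window, and significance checks, but the core arithmetic is this simple, and it lives outside every ad platform.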

Creative and brand

AI can generate variations at scale.

It can’t tell you what your brand should sound like, or whether short-term performance is eroding long-term value.

That judgment still sits with people.

Context and allocation

AI optimises within a channel.

It doesn’t understand your broader business context, and it won’t tell you to shift budget elsewhere.

Cross-channel decisions, where to invest, where to pull back, only happen when someone steps outside the platform view.

Where teams are over-relying

This is where things start to quietly break:

  • Turning on Performance Max and leaving it unchecked
  • Feeding automation whatever creative is available
  • Trusting AI outputs without validating inputs
  • Dropping basic hygiene like search term reviews or exclusions
  • Training models on weak conversion signals

None of this tanks performance overnight. It just widens the gap between reported performance and actual business results.

The change that’s happening

The teams outperforming right now aren’t resisting AI; they’re using it properly.

They let AI handle scale, pattern recognition, bidding and variation.

But they stay close to:

  • what success actually looks like
  • which signals matter
  • how performance is validated
  • how budget moves across the system

That’s the difference.

AI hasn’t replaced the work. It’s moved it.

Louder’s recommendations

  • Fix the inputs first: Standardise conversion events, clean up duplication, and align signals to real business outcomes. If the inputs are wrong, everything downstream will be too.
  • Move control upstream: Server-side tagging, CRM integration, and structured data flows aren’t optional anymore. They’re the foundation of performance.
  • Validate performance, don’t just report it: Use incrementality testing, geo experiments or MMM to understand what’s actually driving outcomes, not just what’s being claimed.
  • Be deliberate with creative: Volume matters, but direction matters more. AI can generate assets, but it can’t define brand.
  • Manage platforms as a system, not in isolation: Each platform optimises for itself. Someone needs to optimise across them.

Get in touch

Get in touch with Louder to discuss how we can assist you or your business, and sign up to our newsletter to receive the latest industry updates straight to your inbox.



About Anmol Kumar

Anmol Kumar is a Paid Media Consultant at Louder. In his spare time, you’ll likely find him dancing Bachata, traveling, or soaking up the sun at the beach.