June 18, 2025

The Creative Attribution Problem That's Wasting Your Ad Spend

By Neil Pursey

You're staring at your Monday morning dashboards. Creative A shows 2.3% CTR on Facebook, Creative B shows 1.7%. On Google, it's the opposite - Creative B leads with 2.1% CTR.

Which creative is actually better?

If you're like most media managers, you'll make budget allocation decisions worth tens of thousands of pounds based on these numbers. But what if both dashboards are lying to you?

What if the creative with "worse" performance is actually superior for the 85% of your audience who matter most?

This isn't about platform differences or measurement discrepancies. It's about a fundamental attribution error that's quietly wasting millions in ad spend across the industry - and it's happening in your campaigns right now.

(Inspired by Peter Buckley's MWC (Minority Who Click) LinkedIn Post)

"One of the reasons we're building Maaten is because we kept seeing brilliant creative campaigns labeled as failures. The problem wasn't the creative - it was the measurement. When you can't separate creative effectiveness from audience click-propensity, you're making million-pound decisions based on platform bias, not business reality."

— Neil Pursey, Co-founder, Maaten

The Hidden Problem: You're Not Measuring Creative Performance

When Creative A outperforms Creative B, you're not seeing creative effectiveness. You're seeing the combined effect of:

  • Creative quality (attention-grabbing elements)
  • Audience composition (who actually saw each creative)
  • Platform algorithm bias (who gets shown the ads)
  • Behavioural patterns (click propensity differences)

Most attribution systems can't separate these variables. They show you the combined result and call it "creative performance."

Here's the problem: You're optimising for the wrong people.

The Minority Who Click (MWC) Bias

According to Global Web Index, only 15% of people clicked on an ad last month. Yet WARC data shows that 56% of global ad spend uses click-based optimisation.

Do the maths: We're spending more than half our budgets optimising for the Minority Who Click on everything.

The Minority Who Click aren't your typical customers. Research shows they're:

  • Predominantly older (65+ especially likely to click)
  • Often clicking by mistake (mobile mis-taps, accidental clicks)
  • Shrinking as a percentage of total online population
  • More likely to click on anything (not specifically interested in your product)

When you optimise for CTR, platform algorithms systematically favour users who click frequently. This creates the wrong feedback loop:

  1. Algorithm learns who clicks on your ads
  2. System shows ads to similar "clicky" users
  3. CTR improves but audience quality decreases
  4. Creative insights become skewed toward MWC preferences

The result? Your "winning" creative might be the one that appeals to serial clickers, not actual buyers.
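
To make the loop concrete, here is a minimal Python sketch of that dynamic. The segment sizes, click rates and purchase rates are illustrative assumptions, not platform data; the point is only the direction of travel: as delivery shifts toward the click-prone segment, reported CTR rises while the expected number of actual buyers reached falls.

```python
# Minimal sketch (illustrative numbers, not platform data) of the MWC feedback loop.
clicky  = {"ctr": 0.040, "buy_rate": 0.0005}  # Minority Who Click: clicks a lot, rarely buys
typical = {"ctr": 0.005, "buy_rate": 0.0020}  # click-averse majority: rarely clicks, buys more

split = {"clicky": 0.15, "typical": 0.85}  # initial impression split, roughly population share

for step in range(4):
    # Blended CTR and expected buyers under the current delivery mix
    ctr = split["clicky"] * clicky["ctr"] + split["typical"] * typical["ctr"]
    buyers_per_1k = 1000 * (split["clicky"] * clicky["buy_rate"]
                            + split["typical"] * typical["buy_rate"])
    print(f"step {step}: CTR {ctr:.2%}, expected buyers per 1,000 impressions {buyers_per_1k:.2f}")

    # A CTR-optimising algorithm shifts delivery toward whoever clicks most
    split["clicky"] = min(1.0, split["clicky"] + 0.15)
    split["typical"] = 1.0 - split["clicky"]
```

Run it and the dashboard metric improves every step (roughly 1.0% to 2.6% CTR) while the number of likely buyers reached falls: exactly the pattern described above.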

How This Affects Your Creative Decisions

The Attribution Confusion

Let's say you're testing two creatives:

Creative A: Professional product shot, benefit-focused copy

  • 2.3% CTR, £18 CPL
  • Traditional analysis: "35% more effective than Creative B"

Creative B: Lifestyle imagery, brand positioning 

  • 1.7% CTR, £24 CPL
  • Traditional analysis: "Underperforming, needs optimisation"

Reality check: Creative A appeals to MWC (older, click-prone users). Creative B appeals to your actual target audience (younger, click-averse but high-value buyers).

The platform algorithm, optimising for CTR, increasingly shows Creative A to easy-to-click audiences while Creative B reaches genuine prospects who don't click but do convert.
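
A quick way to see why the reality check matters is to work the numbers through to customer value rather than stopping at cost per lead. The lead-to-sale rates and lifetime values below are hypothetical assumptions added for illustration (only the CPLs come from the example above), but they show how the ranking can flip once you value the people behind the clicks.

```python
# Minimal sketch: same CPLs as the example above, hypothetical downstream assumptions.
creatives = {
    "A": {"cpl": 18.0, "lead_to_sale": 0.05, "ltv": 400.0},  # appeals to click-prone users
    "B": {"cpl": 24.0, "lead_to_sale": 0.12, "ltv": 800.0},  # appeals to click-averse buyers
}

for name, c in creatives.items():
    cost_per_customer = c["cpl"] / c["lead_to_sale"]
    ltv_per_pound = c["ltv"] * c["lead_to_sale"] / c["cpl"]
    print(f"Creative {name}: £{cost_per_customer:.0f} per customer, "
          f"£{ltv_per_pound:.2f} lifetime value per £1 spent")

# Under these assumptions: Creative A costs £360 per customer and returns £1.11 per £1;
# Creative B costs £200 per customer and returns £4.00 per £1 - the "loser" wins.
```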

Cross-Platform Contradictions

This bias explains why creative performance rankings vary wildly across platforms:

  • LinkedIn winner: Personality-focused creative (appeals to engagement-heavy users)
  • Google winner: Professional imagery (aligns with search context)
  • Facebook winner: Lifestyle content (algorithm favours high-engagement audiences)

You're not seeing creative effectiveness. You're seeing platform-specific audience bias.

Signs You're Stuck in MWC Optimisation

Performance Symptoms:

  • Decreasing conversion quality despite improving CTR
  • Audience composition skewing older over time
  • Cross-platform performance contradictions
  • Difficulty scaling successful campaigns beyond initial audience
  • High click-to-conversion drop-off rates

Operational Symptoms:

  • Creative testing results that don't make intuitive sense
  • "Winning" creatives that feel off-brand or dumbed-down
  • Performance declining when you try to scale winning creatives
  • Creative team frustrated that their best work "doesn't perform"

Strategic Symptoms:

  • Brand awareness declining despite high engagement metrics
  • Customer acquisition costs rising despite CTR improvements
  • Market share decreasing while campaign metrics improve
  • Difficulty attracting premium/younger customers

The Real Cost of Creative Attribution Errors

Budget Misallocation

Consider a £100,000 monthly budget split 70/30 based on CTR performance:

  • £70,000 to MWC-optimised creative (appeals to 15% of population)
  • £30,000 to brand-building creative (appeals to 85% of potential customers)

If the MWC creative drives 2x the clicks but half the customer lifetime value, you're:

  • Overspending £35,000 monthly on low-value acquisition
  • Underspending £35,000 monthly on high-value prospects
  • Net impact: £840,000 annual misallocation
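
For transparency, here is the arithmetic behind those figures as a short sketch. It reads the "2x the clicks but half the lifetime value" assumption as meaning half of the MWC spend buys clicks rather than customer value; that is one reading of the assumption, but the totals match the numbers above.

```python
# Sketch of the misallocation arithmetic above (one reading of the half-LTV assumption).
monthly_budget = 100_000
mwc_spend, brand_spend = 0.70 * monthly_budget, 0.30 * monthly_budget

ltv_ratio = 0.5  # each MWC-driven customer assumed to be worth half as much

overspend = mwc_spend * (1 - ltv_ratio)  # £35,000 buying low-value acquisition
underspend = overspend                   # the same £35,000 never reaches high-value prospects

annual_misallocation = (overspend + underspend) * 12
print(f"Monthly overspend: £{overspend:,.0f}")
print(f"Monthly underspend on high-value prospects: £{underspend:,.0f}")
print(f"Annual misallocation: £{annual_misallocation:,.0f}")  # £840,000
```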

Competitive Disadvantage

While you optimise for easy clicks, competitors who understand true creative attribution are:

  • Capturing high-value customers you're missing
  • Building stronger brand recall among non-clickers
  • Developing creative strategies that scale across audiences
  • Creating sustainable competitive advantages

Creative Strategy Drift

Over time, MWC optimisation pushes creative strategy toward:

  • Clickbait-style headlines that don't reflect brand values
  • Demographic targeting that skews older than intended
  • Creative elements that grab attention but don't build consideration
  • Short-term thinking that undermines long-term brand building

Why Traditional Creative Testing Fails

Problem 1: Confounded Variables

Most A/B tests change two things simultaneously:

  • Creative elements (image, copy, CTA)
  • Audience reached (due to algorithm optimisation bias)

You can't isolate creative impact when audience composition varies between test cells.
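
A small worked example (with made-up numbers) shows how severe this confound can be: give two creatives identical click rates within each audience segment, then let the algorithm hand them different audience mixes, and their aggregate CTRs diverge anyway.

```python
# Sketch with made-up numbers: identical per-segment performance, different audience mixes.
segment_ctr = {"clicky": 0.040, "typical": 0.005}  # same for both creatives within a segment

# Impression mix each creative actually received (assumed, algorithm-driven)
mix = {
    "A": {"clicky": 0.40, "typical": 0.60},  # pushed toward click-prone users
    "B": {"clicky": 0.10, "typical": 0.90},  # reached the click-averse majority
}

for creative, m in mix.items():
    aggregate_ctr = sum(m[s] * segment_ctr[s] for s in segment_ctr)
    print(f"Creative {creative}: aggregate CTR {aggregate_ctr:.2%}")

# Creative A shows 1.90% vs Creative B's 0.85% - a "win" that is entirely audience
# composition, since both creatives perform identically within each segment.
```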

Problem 2: Platform Algorithm Interference

Each platform's algorithm has different MWC populations:

  • Facebook: Engagement-heavy users
  • Google: Search-active users
  • TikTok: Completion-focused users
  • LinkedIn: Professional context clickers

Cross-platform creative insights become meaningless when each platform delivers different audience compositions.

Problem 3: Survivorship Bias

Traditional creative testing only measures people who engage. The 85% who see your ad but don't click are invisible in your data - even though they might be your most valuable prospects.

Problem 4: Short-Term Focus

CTR-based optimisation prioritises immediate clicks over:

  • Brand recall building over time
  • Consideration development in non-clickers
  • Word-of-mouth influence from viewers
  • Purchase intent that doesn't require clicking

Quick Self-Assessment: Are You Falling Into the Attribution Trap?

Answer these five questions about your current creative testing:

  1. Does your "winning" creative feel off-brand or dumbed-down?
    • Yes = Likely MWC optimisation
    • No = Good sign, but check other indicators
  2. Does creative performance ranking change dramatically across platforms?
    • Yes = Platform algorithm bias affecting results
    • No = More consistent attribution (or you're only on one platform)
  3. Has your audience composition shifted toward older demographics over time?
    • Yes = Strong MWC bias indicator
    • No = Audience targeting may be holding steady
  4. Do your click-to-conversion rates keep declining despite CTR improvements?
    • Yes = You're attracting more clickers, fewer buyers
    • No = CTR improvements might be genuine
  5. Are you struggling to scale creative successes beyond initial test audiences?
    • Yes = Likely optimising for audience bias, not creative quality
    • No = Your creative insights may be more robust

Scoring:

  • 3-5 Yes answers: High probability of MWC bias affecting creative decisions
  • 2 Yes answers: Some attribution issues likely present
  • 0-1 Yes answers: Either good attribution discipline or need deeper analysis

What This Means for Your Creative Strategy

The Measurement Problem

The issue isn't that click-based metrics are wrong. It's that they're incomplete. CTR tells you about attention and engagement, but not about:

  • Who's actually seeing your creative
  • Whether those people are your target customers
  • How the 85% of non-clickers are responding
  • Long-term brand impact beyond immediate clicks

The Attribution Solution Preview

The fix requires separating creative effectiveness from audience effects by:

  1. Controlling audience variables in creative testing
  2. Measuring beyond clicks to capture full impact
  3. Understanding true business outcomes from each creative approach

This isn't about abandoning performance marketing. It's about measuring performance accurately so you can optimise for customers who matter, not just customers who click.

What's Next

This creative attribution problem is solvable, but it requires a different approach to testing and measurement. The solution involves three key changes:

  1. Buy on impressions to control audience variables
  2. Measure CTR as a signal of creative attention, not as an optimisation target
  3. Track incremental impact beyond clicks

Understanding the problem is the first step. The next step is implementing controlled creative testing that separates creative effectiveness from audience bias.
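
As a preview of point 3 above, incremental impact is usually estimated by comparing conversion rates among people exposed to a creative against a matched holdout who were not, rather than by counting clicks. The sketch below shows the basic calculation; all figures are placeholder assumptions, and in practice the holdout would come from a geo split, a PSA/ghost-ad control, or a platform lift study.

```python
# Minimal sketch: incremental conversions and lift from an exposed-vs-holdout comparison.
def incremental_lift(exposed_users, exposed_conversions, control_users, control_conversions):
    exposed_rate = exposed_conversions / exposed_users
    control_rate = control_conversions / control_users
    incremental = (exposed_rate - control_rate) * exposed_users
    lift = (exposed_rate - control_rate) / control_rate if control_rate else float("nan")
    return incremental, lift

# Placeholder figures: (exposed users, exposed conversions, control users, control conversions)
for creative, figures in {
    "A": (100_000, 420, 100_000, 400),  # high CTR, barely moves conversions
    "B": (100_000, 620, 100_000, 400),  # low CTR, clear incremental effect
}.items():
    extra, lift = incremental_lift(*figures)
    print(f"Creative {creative}: ~{extra:.0f} incremental conversions ({lift:.0%} lift)")
```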

In our next article, we'll walk through the step-by-step framework for fixing creative attribution in your campaigns - including specific platform settings, measurement approaches, and common implementation mistakes to avoid.

Ready to audit your current creative attribution? Start by answering the five questions above and identifying which campaigns show the strongest symptoms of MWC bias. Those campaigns are your best candidates for implementing controlled creative testing.

The 85% of your audience who don't click are still seeing your ads. The question is: are you creating content that resonates with them, or just with the minority who click on everything?
