The big breakthroughs come from bold ideas

A/B testing has become the comfort blanket of digital marketing. When in doubt, test. When questioned, show the data. It’s what you do when you want to look rigorous—or when you’re quietly unsure what to do next.

And yes, a well-run test can be valuable. It can sharpen decisions, reveal blind spots, challenge assumptions. But more often than not, A/B testing is used not to explore bold ideas, but to avoid them. It’s tactical hesitation dressed up as analytical confidence.

Microsoft, a pioneer in large-scale experimentation, knows this better than most. Former experimentation lead Ron Kohavi reports that while the company runs over 100,000 A/B tests a year, most result in only local optimisations. The big breakthroughs? They came from bold ideas, not button tweaks.

Still, testing persists—often obsessively. Headlines, CTAs, banner colours. We test what’s safe to test, in part because that’s where the tools are easiest to apply. But we rarely pause to ask whether we’re measuring what matters.

On digital media platforms especially, the problem goes deeper. In practice, an A/B test there doesn’t just test audience response—it tests the algorithm’s behaviour. When you change an ad, you also change how the platform distributes it, to whom, and in what context. So what looks like audience preference might simply be algorithmic preference. You’re not always learning what you think you’re learning.

Behavioural science doesn’t let us off the hook either. Nudges can work—but they don’t necessarily stick. A study at Cornell found that putting healthy snacks in more visible locations led to more purchases, but not more consumption. The right cue triggered a response, but not a commitment. The metric moved; the meaning didn’t.

That’s the quiet trap of A/B testing: it creates the illusion of progress. It optimises the surface without ever touching the substance.

This isn’t a rejection of testing. It’s a reminder of its limits. A/B tests can validate a direction—they can’t define it. If your entire strategy is to run variants until something wins, you don’t have a strategy. You have a polite standoff with uncertainty.

Sometimes, the real leap forward isn’t A. Or B. It’s daring to ask something completely different.
