
The web is filled with content explaining why incremental sales matter and why attribution can be misleading. Many marketers have turned to incrementality expert Avinash Kaushik for guidance — and if you’ve somehow missed this critical debate, it’s well worth catching up. What’s curious, though, is how many martech vendors still seem oblivious to the idea that incremental sales are what truly count.
Adding tests for incrementality wouldn’t just unlock dramatic ROI gains for users of these tools — it would also spare us marketers the cringe-inducing task of shading campaigns that were poorly designed, poorly executed and falsely celebrated.
Attribution sucks
Attribution is the typical way martech measures performance. It’s the adult marketers’ equivalent of the childhood “you touched it last” argument. The logic is attractively simple: if someone was shown an ad, received an email and then went on to take an action that qualifies as a conversion — say, a signup, a download or a purchase — then it must have been the marketing activity that caused it. Right?
No, not always. Imagine you run an ecommerce site. You have a unique name, so you own the first page of search results when someone types your brand into Google. Let’s assume no competitors run search ads against your brand, so you own the SERP.
In this situation, the Google Ads campaign that gets attributed the best ROI is almost certainly the one bidding on your own brand. Because you own the SERP for your brand, it’s virtually certain that anyone searching for that term is already looking for you. Placing a Google search ad on a page of results you already dominate will not change whether people reach your site after searching. However, many people will click on your ad because it’s at the top of the page, so Google’s attribution algorithm will tell you that you have achieved fabulous results.
Think about it: you’ve spent money, seen no real improvement in results and ended up with a painfully negative ROI. Yet attribution tells you the campaign was a success. Crazy, right?
Dig deeper: How attribution masks what’s driving growth
What about A/B testing?
“Wait!” I hear you say, “Don’t we have A/B tests?”
We do. If you’re honest, however, you’re probably using them to test different creative approaches, not the difference between running or not running the campaign. All but the most basic marketing automation platforms offer easy, one-click A/B tests between different emails. Yet you must manually segment and spend a lot of time setting up a test if you want to see the impact of sending the email vs doing nothing. It’s not laziness on our part; it’s a lack of features on the platform.
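To make that manual work concrete, here is a minimal sketch in Python of what a “send vs. don’t send” setup involves today. Everything in it is illustrative: the `split_for_holdout` function, the `audience` of contact IDs and the 10% holdout share are assumptions, not a feature of any particular platform.

```python
import random

def split_for_holdout(audience, holdout_share=0.1, seed=42):
    """Randomly assign each contact to a 'send' group or a 'holdout' group.

    `audience` is assumed to be a list of contact IDs; the 10% holdout
    share and the field names are purely illustrative.
    """
    rng = random.Random(seed)   # fixed seed so the split is reproducible
    shuffled = audience[:]
    rng.shuffle(shuffled)
    cutoff = int(len(shuffled) * holdout_share)
    return {
        "holdout": set(shuffled[:cutoff]),   # receives nothing
        "send": set(shuffled[cutoff:]),      # receives the campaign email
    }

# Example with 10,000 hypothetical contact IDs
groups = split_for_holdout([f"contact_{i}" for i in range(10_000)])
print(len(groups["send"]), "to email,", len(groups["holdout"]), "held back")
```

And that’s only the start. Weeks later, you still have to join those groups back to conversion data yourself — exactly the kind of plumbing a platform could handle in one click.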
In many situations, running a proper test to see whether you have increased sales is challenging. Let’s return to the Google search ads example: how do you know if running ads has a positive impact?
Google won’t tell you the difference in conversions between people who clicked on the ad, people who saw the ad but didn’t click and people who saw no ad at all. You can pull data on site visitors who clicked on an ad versus those who didn’t, but you still can’t identify who saw the ad and chose not to click. Why not?
Dig deeper: How smarter measurement can fix marketing’s performance trap
The arrogance of marketers
At its core, the issue comes down to what I call “the arrogance of marketers,” though, in truth, it’s more about the arrogance of martech vendors. There’s an underlying assumption that everything we do delivers a positive ROI, so all that’s needed is to “compare two different things and identify the better one.”
That logic is flawed. Let’s assume I devote half my budget to search advertising and half to email marketing. Yes, A/B testing based on attribution will help improve my performance within each channel, but it won’t tell me how to allocate my budget between them. If the ROI for email looks better, it might be because I’m emailing existing customers who would have bought from my company anyway, regardless of whether they saw the email. Yet attribution says the campaign worked. We’ve already discussed the attribution issues for search advertising, so hopefully you can see that the numbers there won’t be much help either.
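Here is a tiny hypothetical comparison of the two views — all of the figures are invented for illustration. A channel can look like the winner on attributed ROI and the loser on incremental ROI once you strip out the conversions a holdout group would have produced anyway.

```python
def roi(revenue, spend):
    """Simple return on investment: (revenue - spend) / spend."""
    return (revenue - spend) / spend

# Invented figures for two channels with equal spend
email_spend, email_attributed_rev, email_incremental_rev = 10_000, 60_000, 12_000
search_spend, search_attributed_rev, search_incremental_rev = 10_000, 40_000, 25_000

print("Email:  attributed ROI", roi(email_attributed_rev, email_spend),
      "| incremental ROI", roi(email_incremental_rev, email_spend))
print("Search: attributed ROI", roi(search_attributed_rev, search_spend),
      "| incremental ROI", roi(search_incremental_rev, search_spend))
# Attribution says email wins the budget; the incremental view reverses the ranking.
```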
The incrementality test button
All I want to see is an “incrementality test” button in the martech tools that measure ROI: a simple way to test the impact of running a campaign versus not running it. It would give us the data to tell whether campaigns are delivering results that help our organizations achieve their goals.
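As a rough sketch of what such a button might report under the hood, here is a hedged Python example: it compares conversion rates for a treated group and a holdout group, then returns the lift alongside a standard two-proportion z-test. The function name `incrementality_report` and the sample numbers are assumptions made for illustration, not anyone’s actual product spec.

```python
from math import sqrt, erfc

def incrementality_report(treated_conv, treated_n, holdout_conv, holdout_n):
    """Compare conversion rates for treated vs. holdout groups.

    Returns the two rates, absolute and relative lift, and a two-sided
    p-value from a two-proportion z-test (normal approximation).
    """
    p_t = treated_conv / treated_n
    p_h = holdout_conv / holdout_n
    pooled = (treated_conv + holdout_conv) / (treated_n + holdout_n)
    se = sqrt(pooled * (1 - pooled) * (1 / treated_n + 1 / holdout_n))
    z = (p_t - p_h) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value
    return {
        "treated_rate": p_t,
        "holdout_rate": p_h,
        "absolute_lift": p_t - p_h,
        "relative_lift": (p_t - p_h) / p_h if p_h else float("inf"),
        "p_value": p_value,
    }

# Invented numbers: 9,000 emailed contacts, 1,000 held out
print(incrementality_report(treated_conv=540, treated_n=9_000,
                            holdout_conv=50, holdout_n=1_000))
```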
I can hear martech vendors worldwide saying in a terrified whisper: “But what if it shows that campaigns don’t generate better results than doing nothing?”
I’ve got some bad news for those vendors: yes, some campaigns won’t deliver an uplift in results. That is inevitable, but it’s also rare. Many studies show that marketing activity improves organizational performance, so we don’t have to worry. Sure, it will stop investment in some campaigns, but they are ones that any professional, knowledgeable marketer would cancel anyway.
Which martech vendors will help me by making it as easy to test for incrementality as it is to compare the copy on two different ads? Let me know. I’m sure it’s a feature that will win you — and us — many more customers.
Dig deeper: Measuring marketing incrementality — Best of the MarTechBot