My first experience with conversion lift/incrementality testing came in ~2013 while at LivingSocial. At the time, we were using Criteo for dynamic retargeting of our local deals offers. We ran an incrementality test that showed a ~0.5% lift in sales for our test group, but, when you factored in the cost of the media itself, it was effectively a breakeven campaign. Criteo's reporting pegged the campaign at a 7x ROAS using their default attribution window. In the end, we continued to run the campaign at breakeven, as we believed there was some inherent value in the impressions themselves. From that day forward, I've been obsessed with incrementality testing, and generally skeptical about paid retargeting and view-through attribution.
With that obsession in mind, I'm happy to share the results of a number of lift tests that the Thesis team and our clients have run over the last 6 months.
This brand has a conventional funnel (click -> lead -> subscription), and we ran a handful of typical retargeting audiences (page visitors, leads, etc.) in our retargeting campaign. I was deeply skeptical that this retargeting campaign as a whole was accretive, and I was especially concerned about the leads audience, given that those leads are already emailed very aggressively to push them over the line to subscribe.
In this case, the cost per incremental subscription was $52 (well within our CPA target), and most importantly, the conversion lift percentage was 661.2%. To my surprise, retargeting really moved the needle here.
This brand is a high consideration, high AOV product with a typical Shopify funnel. Our experiment looked at the incrementality of our retargeting campaign, which targeted standard audiences including visitors, add to cart, initiate checkout, etc. We measured lift on three metrics: View Content, Add to Cart, and Purchase.
The lift test showed that our retargeting was having almost no impact on Purchase. The lift percentage was only 9%, and the cost per incremental conversion was $296. Meanwhile, for the same period, the Ads Manager dashboard reported a $215 CPA with a 1-day click attribution window and a $31.59 CPA with the default attribution window (28-day click + 1-day view).
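For reference, the arithmetic behind lift percentage and incremental CPA is straightforward. Here's a minimal Python sketch using hypothetical test/control counts chosen to reproduce the 9% lift and $296 figures above (the actual group sizes and spend aren't shown in this post):

```python
def lift_pct(test_conv: float, scaled_control_conv: float) -> float:
    """Percent lift of the test group over the control group,
    after the control count has been scaled to the test group's size."""
    return (test_conv - scaled_control_conv) / scaled_control_conv * 100

def incremental_cpa(spend: float, test_conv: float, scaled_control_conv: float) -> float:
    """Spend divided by the conversions attributable to the ads
    (test conversions minus the scaled control baseline)."""
    return spend / (test_conv - scaled_control_conv)

# Hypothetical numbers: 109 test purchases vs. a scaled control baseline
# of 100, on $2,664 of spend.
print(lift_pct(109, 100))               # 9.0
print(incremental_cpa(2664, 109, 100))  # 296.0
```

The key point is that the incremental CPA divides spend by only the *extra* conversions the ads caused, which is why it can be so much higher than any attribution-window CPA in the dashboard.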
Notably, there was meaningful lift on the Add to Cart metric: nearly 50%.
One constraint of Facebook's lift studies is that they count conversions as tracked by their pixel only during the experiment's test schedule (in this case 30 days). Given this is a high consideration product with a long sales cycle, it's reasonable to assume that the campaigns during the test period might lead to purchases well after the test schedule, which their pixel reporting wouldn't capture by definition. We've struggled to reconcile the very reasonable hypothesis that purchases might be long delayed with the very poor incremental performance that we could actually observe.
One suggestion I received from a Facebook rep was to take Purchase data from Shopify, upload it as an Offline Conversions dataset, and use that offline purchase event as another objective for the experiment. That way, any sales generated "offline" even well after the test schedule period could be attributed back to the campaigns that ran during the test schedule, and thus would be included in the incrementality results. To be honest, we haven't had the opportunity to try this yet.
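If you want to try this approach, the general shape is to map each Shopify order to an offline event payload, with customer identifiers normalized and SHA-256 hashed before upload (hashing is required by Facebook's offline events spec). A sketch, assuming a simplified order dict; the exact field names should be checked against the current Marketing API docs, and the actual upload call to the offline event set endpoint isn't shown:

```python
import hashlib

def sha256_norm(value: str) -> str:
    # Facebook requires match keys (email, phone, etc.) to be
    # lowercased, trimmed, and SHA-256 hashed before upload.
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

def shopify_order_to_offline_event(order: dict) -> dict:
    """Map a simplified, hypothetical Shopify order record to an
    offline Purchase event payload."""
    return {
        "match_keys": {"email": [sha256_norm(order["email"])]},
        "event_name": "Purchase",
        "event_time": order["created_at_unix"],  # Unix timestamp of the sale
        "value": order["total_price"],
        "currency": order["currency"],
    }

order = {"email": " Jane@Example.com ", "created_at_unix": 1600000000,
         "total_price": 120.0, "currency": "USD"}
event = shopify_order_to_offline_event(order)
```

Because attribution for offline events is keyed on the hashed identifiers rather than the pixel, purchases landing weeks after the test window can still match back to ad exposure during the experiment.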
After seeing the results described above, we ran a lift test that included all campaigns (prospecting + retargeting) for 30 days.
In this case, the results were more positive, although we saw a relatively modest overall ROAS.
The implied ROAS lift here is only 1.2x, but again, we have to weigh those results against the nature of the product's sales cycle.
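Reading "implied ROAS lift" as incremental revenue per dollar of spend (my interpretation), the computation looks like this, with hypothetical revenue and spend figures chosen to produce 1.2x:

```python
def incremental_roas(test_revenue: float, scaled_control_revenue: float,
                     spend: float) -> float:
    """Revenue attributable to the ads (test minus scaled control
    baseline) per dollar of spend."""
    return (test_revenue - scaled_control_revenue) / spend

# Hypothetical: $22k test revenue vs. a $10k scaled baseline on $10k spend.
print(incremental_roas(22000, 10000, 10000))  # 1.2
```

As with incremental CPA, this will generally look far worse than dashboard ROAS, since it credits the ads with only the revenue the control group did not generate.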
This brand has a relatively strong trial offer which leads into a regular subscription cadence. Again, we ran a relatively conventional set of retargeting audiences, and looked at incrementality based on Add to Cart, Initiate Checkout, and Purchase.
We saw a 41% lift on the purchase event, and an incremental CPA of $62.89. For that period, our 1-day click CPA was $67 and our default CPA (28-day click + 1-day view) was $32. In this case, we've simply adjusted our thinking to set goals with a more conservative attribution window in mind.
Facebook's Business Help Center has a number of excellent pages on Holdout Tests; I suggest you start there. Also note that your Facebook rep can help you configure "Power Lift Tests", which combine conversion lift testing and brand lift testing into a single experiment, though these aren't included in the Experiments interface (as far as I know, at least).
If you have any critiques of my analysis above, please let me know (email@example.com). I'm always eager to learn more about incrementality. And if you need help running this sort of experiment, obviously just let me know. In the future, I plan to share more incrementality results on this blog from Facebook and hopefully other channels too.