On Facebook, ad relevance diagnostics compare your ads to ads using the same optimization goal that are competing for the same audience. They’re broken up into three rankings: Quality Ranking, Engagement Ranking, and Conversion Rate Ranking. According to Facebook, they’re made available to help diagnose underperforming ad creative. Historically at Thesis we haven't paid much attention to these rankings, but we wanted to do a more formal analysis to understand how they actually correlate with the performance metrics our clients care about.
In the above table we pulled performance metrics in aggregate for 10 brands, and then pulled those same CTR, CPA, and ROAS metrics for the ads labeled Above Average, Average, or Below Average. We don't see any clear trends in this data set. We had hoped to see better CPAs and/or ROAS on higher-ranked ads... but that was clearly not consistent. Notably, we did notice that higher Quality Rankings seemed to correlate with lower CPMs. That makes sense intuitively, as Facebook defines Ad Quality as "How your ad's perceived quality compared to ads competing for the same audience." So Facebook appears to reward ads with higher Quality Rankings with lower CPMs.
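For readers who want to reproduce this kind of breakdown, the aggregation is straightforward: roll ad-level spend, impressions, conversions, and revenue up by ranking label, then derive CPM, CPA, and ROAS per bucket. The sketch below is illustrative only; the field names and numbers are hypothetical placeholders for whatever your Ads API export contains, not our actual client data.

```python
# Hedged sketch: bucket ad-level metrics by Quality Ranking label.
# All field names and values are hypothetical, for illustration only.
from collections import defaultdict

ads = [
    {"quality_ranking": "above_average", "spend": 500.0, "impressions": 40000, "conversions": 25, "revenue": 1500.0},
    {"quality_ranking": "average",       "spend": 500.0, "impressions": 30000, "conversions": 20, "revenue": 1100.0},
    {"quality_ranking": "below_average", "spend": 500.0, "impressions": 20000, "conversions": 15, "revenue": 700.0},
]

def bucket_metrics(ads):
    # Sum the raw inputs per ranking bucket first...
    totals = defaultdict(lambda: {"spend": 0.0, "impressions": 0, "conversions": 0, "revenue": 0.0})
    for ad in ads:
        t = totals[ad["quality_ranking"]]
        t["spend"] += ad["spend"]
        t["impressions"] += ad["impressions"]
        t["conversions"] += ad["conversions"]
        t["revenue"] += ad["revenue"]
    # ...then derive ratio metrics from the bucket totals (never average ratios).
    return {
        rank: {
            "cpm": 1000 * t["spend"] / t["impressions"],
            "cpa": t["spend"] / t["conversions"],
            "roas": t["revenue"] / t["spend"],
        }
        for rank, t in totals.items()
    }
```

One detail worth flagging: the ratios are computed from bucket totals rather than averaged per ad, so high-impression ads aren't diluted by small ones.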
Engagement Ranking takes into account actions such as clicks, reactions, and comments to determine a score. Looking specifically at CPAs, we found that ads ranked Average tended to have better CPAs and ROAS than Above Average ads. And while Above Average ads had a better CTR than Average ads in every account we looked at, we found that looking at actual CTR provided more actionable insights, especially when comparing ads with the same Engagement Ranking.
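The "actual CTR within a ranking bucket" comparison can be sketched as below. This is a toy example with made-up ad IDs and numbers, just to show the shape of the analysis: two ads can share the same Engagement Ranking label while their real CTRs differ by 2x.

```python
# Hypothetical sketch: rank ads by computed CTR inside each Engagement
# Ranking bucket, since the bucket label alone can hide large CTR spreads.
from collections import defaultdict

ads = [
    {"id": "a1", "engagement_ranking": "average",       "clicks": 120, "impressions": 10000},
    {"id": "a2", "engagement_ranking": "average",       "clicks": 60,  "impressions": 10000},
    {"id": "a3", "engagement_ranking": "above_average", "clicks": 90,  "impressions": 10000},
]

def ctr_within_ranking(ads):
    buckets = defaultdict(list)
    for ad in ads:
        ctr = ad["clicks"] / ad["impressions"]
        buckets[ad["engagement_ranking"]].append((ad["id"], round(ctr, 4)))
    # Highest CTR first within each bucket.
    return {rank: sorted(rows, key=lambda r: -r[1]) for rank, rows in buckets.items()}
```

Here both a1 and a2 are labeled Average, but a1's 1.2% CTR is double a2's 0.6%, which is exactly the kind of gap the label obscures.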
The Conversion Rate Ranking compares the expected conversion rate of ads against other ads targeting the same audiences. While these rankings generally correlated with the conversion rates we calculated ourselves, we did find that in quite a few of these accounts, ads ranked Average often had higher conversion rates than ads ranked Above Average, even with the same targeting. Given those inconsistencies and the large number of unranked ads in all of these accounts, we found creating a custom conversion rate metric more effective than relying on the Conversion Rate Rankings.
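A minimal sketch of that custom-metric idea, with assumed field names and made-up numbers: compute CVR per ad directly (conversions over link clicks here; the denominator is a modeling choice, not Meta's definition) and sort on it. One practical advantage is that unranked ads still get a score.

```python
# Hedged sketch of a custom conversion-rate metric. Field names and
# values are hypothetical; the denominator (link clicks) is our choice.
def cvr(ad):
    return ad["conversions"] / ad["link_clicks"] if ad["link_clicks"] else 0.0

ads = [
    {"id": "a1", "cr_ranking": "above_average", "conversions": 30, "link_clicks": 1000},
    {"id": "a2", "cr_ranking": "average",       "conversions": 45, "link_clicks": 1000},
    {"id": "a3", "cr_ranking": None,            "conversions": 20, "link_clicks": 500},  # unranked ad still scored
]

ranked_by_cvr = sorted(ads, key=cvr, reverse=True)
# In this toy data the "average" ad (a2, 4.5% CVR) outranks the
# "above_average" one (a1, 3.0%), mirroring the inconsistency above.
```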
Unfortunately, we were unable to find clear evidence that these rankings are more helpful than other KPIs (like CTR, CVR, Hook Rate, etc.) for day-to-day ad optimizations, especially considering that a large percentage of the ads we analyzed were entirely unranked. We imagine that with a much larger data set (like all Facebook advertisers) these rankings might be very useful, but at Thesis our practical takeaway is not to pay much attention to them.
At Thesis, we’ve developed a testing methodology that generates more learnings, faster, while also protecting our core scaling campaigns from creative flops. We’ll show you exactly how we set up these creative tests and generate more creative wins for our clients.
Inspired by a client with whom we’ve seen delivery shift dramatically away from iOS and towards Android, we wanted to see if this was part of a broad shift on the platform.