+7 votes

You have an ad set whose creatives are working well and hitting your CPA targets. Performance is steady and stable, but you know it can start to decline at any time due to creative fatigue. You have new creatives ready to test and run, but you don't want to rock the boat of the high-performing ad set.

How do you approach this? Some possibilities we have tried:

  1. Introduce some *new concepts* in the *current* ad set (risks: ad set learning can get reset, also new creatives may not get enough spend).
  2. Test *new concepts* in a *new ad set* targeting the *same audience* - but at lower spend. Add top creatives into the high-performing ad set. (Risk -> audience overlap between the new and older ad sets, assuming smaller audiences; the impact may be limited, though, because you are showing very different ads to the same audience.)
  3. Test *new concepts* in a *new ad set* targeting the *same audience* (but do this as an AB test within FB's interface so that we have a clear audience split between the old and new ad sets).
  4. Test *new concepts* in a *new ad set* targeting a *different audience* (perhaps broad audience) so as to avoid audience overlap.


Which approach have you found effective?

by (1.5k points)

2 Answers

+3 votes

This is a topic that is near and dear to my heart. My thesis is that with black-box, algorithmic campaign optimization, the only real levers that marketing teams have to pull now in improving performance are budget, creative experimentation, and event experimentation. This new environment has shifted the way that marketing teams work pretty dramatically: in building a systematic growth optimization process, they should be focused not only on producing new creative but producing new, radically experimental creative concepts to try to maintain performance.

[Diagram: creative performance lifecycle, from the "New and Exciting" phase through "Rapid Performance Deterioration"]

Two other concepts that I've put some thought into and that fit within this paradigm are:

  • Fundamental long-term performance of a creative. If a creative were left running on a very long timeline, its performance would settle at some long-term level. Over a creative's overall lifecycle, the period of elevated performance is short relative to that long timeline; its hypothetical "fundamental mean performance" is therefore only slightly above that long-term level (and far below its high-water mark).

    Advertising performance across a portfolio of creatives is mean-reverting, so the way to maintain high performance is to refresh some percentage of your basket of creatives regularly enough that none of them ever deteriorates past the "New and Exciting" phase of its lifecycle (from the diagram). If you track creatives individually but lean on the performance of the portfolio as a whole, you can maintain a higher level of performance than the combined long-term mean.
     
  • Performance "half-life" of a creative. All marketing creative has a performance half-life, and the key to maintaining performance is starting the replacement process for a highly performant creative before it ages past its half-life (and progresses into the "Rapid Performance Deterioration" phase from the diagram). I've seen teams squeeze performance out of ads without thinking about how they'll replace them until near the very end of their useful lives. You don't want cyclical performance ups and downs in your campaigns as new creative outperforms, saturates, and then underperforms -- you want to pre-empt that deterioration phase by experimenting so heavily that you're always producing outperformers that keep your average performance high.
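The half-life idea above can be made concrete with a toy decay model. This is a minimal sketch with illustrative numbers, not measured Facebook data: it assumes performance decays exponentially from a launch peak toward a long-term floor.

```python
# Toy model: a creative's performance decays exponentially toward a
# long-term floor. All parameters here are illustrative assumptions.

def performance(t_days, peak=2.0, floor=1.0, half_life=14.0):
    """Relative performance (e.g., a multiple of the long-term mean) at
    day t: the excess above the floor halves every `half_life` days."""
    return floor + (peak - floor) * 0.5 ** (t_days / half_life)

# Start replacing a creative at one half-life -- when half of its excess
# performance is gone, before the rapid-deterioration phase.
print(performance(0))     # 2.0  (launch peak)
print(performance(14.0))  # 1.5  (half the excess gone: start replacing)
print(performance(90.0))  # ~1.01 (settled near the long-term floor)
```

Under this model, a portfolio refreshed on a cadence shorter than one half-life never drifts far below its peak, which is the mechanism behind the mean-reversion point above.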

So to reframe your question in this context: how can a marketing team ensure that creatives are always replaced before they reach their half-lives, in a way that doesn't drive up CPMs from audience overlap?

I think your option 2 is probably the best approach: run a test campaign targeted at the same audience with a lower level of daily spend. I don't like running A/B tests on Facebook because the A/B test framework (sending equivalent amounts of roughly similar traffic to all variants) is not how Facebook distributes traffic "in the wild." A/B testing creatives to ensure equivalent traffic exposure is also slow and expensive. Facebook does a great job of evaluating creative and allocating budget to the best-performing ads -- I don't see any reason not to let it do that even in a test setting.

So I think your best bet in this creative testing process is to constantly add new creative and kill non-performing creative in a permanent test ad set (in a different campaign, so as to separate performance weighting), and shift the best creatives into your primary campaign as you retire creatives nearing their half-lives.
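That rotation loop could be sketched as below. Everything here is hypothetical -- the CPA target, the spend floor for promotion, and the ad records are placeholder assumptions, not Marketing API calls; a real implementation would pull these metrics from the Ads reporting API.

```python
# Hypothetical sketch of the test-and-promote rotation described above.
# Thresholds and record fields are assumptions, not a real Ads API schema.

def rotate_creatives(test_ads, primary_ads, cpa_target, half_life_days):
    """Kill losing test creatives, promote proven ones to the primary ad
    set, and flag primary creatives approaching their half-life."""
    # 1. Keep only test creatives hitting the CPA target.
    survivors = [ad for ad in test_ads if ad["cpa"] <= cpa_target]
    # 2. Promote survivors with enough spend to be statistically meaningful
    #    (the $1,000 floor is an assumed threshold).
    promoted = [ad for ad in survivors if ad["spend"] >= 1000]
    # 3. Flag primary creatives past their assumed half-life for retirement.
    retiring = [ad for ad in primary_ads if ad["age_days"] >= half_life_days]
    return promoted, retiring

test = [{"name": "A", "cpa": 8, "spend": 1500, "age_days": 5},
        {"name": "B", "cpa": 15, "spend": 400, "age_days": 5}]
primary = [{"name": "C", "cpa": 7, "spend": 9000, "age_days": 20}]
promoted, retiring = rotate_creatives(test, primary,
                                      cpa_target=10, half_life_days=14)
print([a["name"] for a in promoted])  # ['A']
print([a["name"] for a in retiring])  # ['C']
```

The point of the structure is that promotion and retirement happen on every pass, so the primary ad set is continuously backfilled rather than refreshed in disruptive batches.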

I've heard some people prescribe a specific maximum number of creatives to put into an ad set, but I don't think that can be generalized. With enough budget, an ad set should be able to support a large (20+) number of ads. You'll need to experiment with the optimal number of ads to maximize delivery across all vetted creatives within the confines of your own budget.

by (8.2k points)
+1 vote

> 2. Test *new concepts* in a *new ad set* targeting the *same audience* - but at lower spend. Add top creatives into the high-performing ad set. (risk -> audience overlap between the new & older ad sets assuming we have smaller audiences, although the impact of this can be limited because you are showing very different ads to the same audience.)

This option is the safest.

  1. It depends on the lifecycle of the ad set, but FB prioritizes ad sets that have finished the learning phase or that have much higher spend. By that rule, your performance should be safe if the small test ad set targets the same audience with the same settings as the original ad set, just at a lower budget. Even so, keep an eye on it to be sure.
  2. If your ad set is performing well, don't break it by adding new ads (you will restart the learning phase). Only change creatives once you notice a visible drop in performance that you can't fix by other means.
  3. A good creative can absorb tens or even hundreds of thousands of dollars in spend. In my experience, the biggest problems always come from resetting the learning phase.

 

by (300 points)
edited by