5 Mistakes You Are Making In Creative Testing For Performance Advertising And How To Fix Them

Do you remember the times when all marketers cared about was campaign setup, bidding, and targeting? I do! It’s a mistake still made constantly, even by experienced marketers. These days, when Facebook and other platforms like TikTok, Snapchat, or Pinterest have enough data to build a stack of hard drives reaching to the moon (joking!), we all know that creative is the most important factor in your campaigns. And, as Naval Ravikant once said, it’s nearly impossible to automate creativity.

“A campaign with a bad setup but great creatives will outperform a campaign with a great setup but bad creatives” – Paulo Coelho (if he did performance marketing; it’s not a real quote, FYI).

But how do you find creatives that work, and how do you improve them constantly based on current learnings? Let me explain how we do it. In 2020 (I know, not the best year) we cumulatively spent around $3,000,000 on Facebook alone, and all these learnings are based on that experience.

After you finish you’ll know:
  1. How to prepare the first iterations of creatives? And why concept thinking is crucial here.
  2. How to store all your creative data — PRISM.
  3. How to leverage the ad platform’s algorithm to fish out winning creatives.
  4. How to be the snail of testing, and why you need to be a snail if you want to be a hawk and an owl.
  5. Finally, how to organize learnings into your own creative bible (where you’re the Jesus of advertising).

So, do you have coffee in your hand? Ready to crush it? I just took a sip of Brazilian coffee from my favorite cup, so let’s start!

Sin #1 — you’re not testing enough creatives

How to prepare the first iterations of creatives? And why concept thinking is crucial here

How to find winning creatives? Unfortunately, it’s not so easy. We literally spent millions of dollars on creative testing, and I can swear on high ROAS that very often our gut is at odds with what’s working. You might say, “my clients are strange, they don’t like my perfect ad”. But it’s not that they’re strange; it’s you who is unprepared.

When we start with a new client, we don’t know what concept will work. Our task is to prepare new creatives that work from the first iteration. If a client has past data, we analyze it as well, but we never skip proper research.

Concept thinking to find winning creatives. As you don’t know what’s going to work, prepare a few concepts to test. We define a concept as a group of creatives sharing a similar approach to the product our client is selling. Treat it like boxes of candy on Halloween: each box holds a different type of candy, but all of them have the same goal, to avoid tricks. Going from the top, we build creatives from concept to angle to hook. Each concept can have many angles, and each angle can have different hooks.

So, if we were selling rainwater tanks, our concept might be “how to easily get free water for gardening purposes”. Our angles could be:

1. “easy to set up”, with these hooks:

  • free water in less than 5 minutes
  • set up this tank in 5 minutes to save rainwater

2. “water for free”, with these hooks:

  • Only idiots pay for water in the garden
  • US citizens can piggyback on Canadians to get free water for gardening

Is this clear?
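The concept → angle → hook hierarchy above can be sketched as a nested structure. A hypothetical Python example using the rainwater-tank illustration (none of this comes from a real campaign):

```python
# Concept -> angles -> hooks, using the rainwater-tank example above.
concepts = {
    "how to easily get free water for gardening purposes": {
        "easy to set up": [
            "free water in less than 5 minutes",
            "set up this tank in 5 minutes to save rainwater",
        ],
        "water for free": [
            "Only idiots pay for water in the garden",
            "US citizens can piggyback on Canadians to get free water for gardening",
        ],
    },
}

# Each (concept, angle, hook) triple becomes one ad variant to test.
ads = [
    (concept, angle, hook)
    for concept, angles in concepts.items()
    for angle, hooks in angles.items()
    for hook in hooks
]
print(len(ads))  # 4 ad variants: 1 concept x 2 angles x 2 hooks
```

One concept with two angles and two hooks each already yields four ads; five such concepts get you to the 20-ad test described below.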

To get learnings fast, test a lot of concepts as fast as possible:

1. As we don’t know which concept will work, we prepare a number of concepts to test.

2. If we do 20 ads, we prepare 5 concepts with 4 ads each. Each ad ‘attacks’ from a different angle, showing different problems, desires, features, or benefits of the product we’re advertising.

3. Prepare a testing campaign. How do you effectively test 20 creatives at once? What works on the accounts we manage is what we call the Ads Incubator: a campaign where we rapidly test all creatives at once to find winners.

Let’s move to point 2:

Sin #2 — you don’t store creative data

How to store all your creative data — PRISM

If you run 5–10 ads per month, it’s not hard to keep track of them all. If you test 20 per week, it’s much harder. Multiply this by the 10 clients you have and it gives you ~200 ads to manage. Per week. What’s the challenge here? To store all creative data for all ads in one place and merge it with performance data. It’s not impossible; it’s actually how our PRISM tool works.

We code all variables into the client’s PRISM, which is connected to the Facebook API. Thanks to this, we have one place where we keep info about all the creatives we’ve tested and all their creative variables, merged with performance data. How cool is that? Really cool! Thanks to this, we know not only which ads work best but why they work. How?

Imagine a beautiful world where Facebook is predictable, without bugs on Black Friday, CPA is always below KPI, and you have access to a creative tool with performance data and all the creative-variable info for every ad. We can’t guarantee the first two, but we can show you how to prepare the last one.

  1. Build the table: our go-to tool for this is Airtable. It looks nice, works fast, and connects with the Facebook API easily. One database per client, all ads in one database, each row representing one ad.
  2. Build variables: how do you want to measure your ads? Start with the basics, like whether an ad is an image, video, or carousel. Next, add creative data like concept, hook, and angle. Why not add creative format (whether the ad is a meme, a testimonial, an influencer video)? You can code which product you advertise in a particular ad, or which value proposition you’re trying to sell. You can also measure your team’s performance by coding copywriter, designer, and strategist variables into each ad. As an old Google saying goes, “the more data you have, the better data you have”. When your variable table is ready, add a CRID column (usually the first column), which stands for CReative ID plus a number (a different number for every ad), for example CRID00007.
  3. Connect data: ask your dev to connect your client’s Facebook ad account with your PRISM. First, add performance columns to your PRISM. If you’re like us and run ads for mobile applications, you want columns like ad spend, completed registrations, cost per completed registration, app installs, cost per app install, landing page views, cost per landing page view, link clicks, and cost per link click. When the variables are ready, your dev needs to map them to FB events and connect the API to get daily or hourly updates. It’s that simple.
  4. OK, your PRISM is ready. You spend some money on Facebook. Now what? Now it’s time for what we like most: data analysis. You can do it inside Airtable (far from perfect due to Airtable’s limitations), build your own dashboard (costly), or download the PRISM data, put it into Google Sheets, and use pivot tables to build as many breakdowns, graphs, and data sets as you want. Ta-da: you have a working tool for advanced creative analysis. Your father is proud of you now!
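The merge-and-pivot step above can be sketched in Python with pandas. The column names and numbers here are illustrative placeholders, not the actual PRISM schema: join the exported creative variables to the performance data on CRID, then pivot.

```python
import pandas as pd

# Creative variables exported from the Airtable-style database (illustrative).
variables = pd.DataFrame({
    "CRID": ["CRID00001", "CRID00002", "CRID00003"],
    "format": ["video", "image", "video"],
    "concept": ["water for free", "water for free", "easy to set up"],
})

# Performance data pulled via the ad platform API (illustrative numbers).
performance = pd.DataFrame({
    "CRID": ["CRID00001", "CRID00002", "CRID00003"],
    "spend": [120.0, 80.0, 200.0],
    "installs": [40, 10, 100],
})

# One row per ad, creative variables and performance side by side.
merged = variables.merge(performance, on="CRID")
merged["cpi"] = merged["spend"] / merged["installs"]

# Example breakdown: average cost per install by creative format.
by_format = merged.pivot_table(values="cpi", index="format", aggfunc="mean")
print(by_format)
```

The same pattern scales to any breakdown you coded as a variable: swap `index="format"` for concept, hook, or copywriter.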

PRISM is a great tool for keeping all of your data organized. In minutes you can go from a simple analysis, like which format works best, to an advanced one, like which hook works best with which angle and which influencer. Pretty advanced stuff that will make your creative product much better and more effective than the competition who didn’t read this ;)

Sin #3 — you don’t know how to test properly

How to leverage the ad platform’s algorithm to fish out winning creatives, and what to do next

Setup isn’t as important as the creatives themselves, but if you want to be systematic, organized, and effective, you need a good system in place. Try to find the perfect setup for each client (it might differ a bit between them), but keep the setup constant during a test (you want to test only one thing at a time). Here is how we do it:

  • Setup: 1 campaign, 1 ad set, all creatives inside this ad set. BUT this is exactly the thing you want to test, and it depends on many factors. We have seen accounts where grouping ads by concept worked best, but also accounts where each ad worked best in a separate ad set. Test all setups to find what works best for you, and keep in mind that it might differ for each of your clients.
  • What budget do I need? Don’t put more than 20% of the total daily budget into testing, to keep the account healthy and the CPA stable. The Ads Incubator budget should be determined by your CPA and the number of ads. Also keep in mind that Facebook needs 50 conversions to exit the learning phase, and that in 95% of cases 2–6 ads will take over around 80% of the budget. Let’s do the math: 50 (minimum conversions) * $5 (CPA) gives us $250 per day. Can’t afford it? You can spread this over the whole test (3–4 days), but keep in mind that you’ll gather less data.
  • Anything extra? You can set up rules to pause ads with high spend and no conversions (it happens, as Facebook pushes ads with high engagement first, but an ad with high engagement isn’t always the best in terms of CPA or ROAS); it will save your budget. Also, if your CPA is really high (like $100) and you don’t want to spend tons of money on testing, determine winning ads using higher-funnel metrics (like app installs, clicks, etc.).
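The budget math above can be written out as a couple of small helpers. This is a sketch only; the 50-conversion learning-phase threshold and the 20% testing cap are the rules of thumb from the text:

```python
def incubator_budget(cpa, min_conversions=50):
    """Daily budget needed to hit the learning-phase threshold in one day."""
    return cpa * min_conversions

def max_testing_budget(total_daily_budget, cap=0.20):
    """Cap testing spend at 20% of the total daily budget."""
    return total_daily_budget * cap

daily = incubator_budget(cpa=5)  # 50 conversions * $5 CPA
print(daily)                     # 250
# Can't afford $250/day? Spread it over a 3-4 day test instead:
print(daily / 4)                 # 62.5 per day, at the cost of less data per day
```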

Sin #4 — you don’t know what has an impact on your creatives

How to be SHO?

SHO stands for Snail, Hawk, Owl. I know your initial thought: “how did we jump from performance advertising to the zoo?!” But let me explain why I put so many animals in this paragraph: snails are slow, the hawk is precise, and the owl is smart, exactly as you want to be with creative testing. You need to be slow (unless you’re as rich as Apple and can afford everything, even triple ice creams ;)) so you don’t test too many variables at one time; it’s really important to know which change brought good results. The hawk is precise, as you want to be with creative ideation and improvements: you want to know what you want to change, how, and why. And last but not least, the owl is smart, as you need to be, because a chaotic creative testing process sucks: you’ll spend thousands or millions on ads and won’t be able to repeat your successes or avoid your mistakes.

When we do creative testing, we follow these principles:

  • Test many concepts at once. Throw many things at the wall and see what sticks. Quantity gives quality.
  • Focus on what’s working and pause what isn’t. It’s crucial at the beginning to find at least 1–3 concepts that work and then iterate on them.
  • Be slow with changes; think about what can have a huge impact on a winning creative. Sometimes it’s a matter of the font; sometimes you need to tweak the copy or add/delete something. In general, you can follow this concept (image found on the Internet, author unknown).

We do small iterations on winning concepts/ads, and all these changes are logged in PRISM. How? We have a few extra columns:

  • Iteration: it’s empty for most ads, but when we play with a winning creative we fill it with simple values like v2 (second iteration), v3 (third), and so on. The more important column is the “change log” (was/is/changed), where we store what we changed. I’ll explain it with a pizza example; let’s assume we know that pizza is winning:
  • Was: (how it was in the first winning ad) pizza was the winning concept
  • Is: now it’s pizza with either pineapple or ham
  • Changed: added pineapple to one variant and ham to the second

Do you get it now? Thanks to this, we’re able to track changes and the impact of each change. As it’s more of a description, it’s much harder to group in pivot tables, but what’s most important is that you have a changelog for your creatives: you see how they evolve. If you want to be pro pro pro, you can prepare a change log in Figma (or any tool) that looks similar to the pizza example, so you have a visual representation of all your changes. How does it impact CRIDs? Instead of CRID00007, you would have CRID00007v2, CRID00007v3, and so on, so you know where each ad comes from.
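The CRID versioning and the was/is/changed log described above could be generated with a hypothetical helper like this (the CRID format follows the examples in the text; the field names are illustrative):

```python
def make_crid(number, version=1):
    """Build a creative ID: CRID00007 for v1, CRID00007v2 for later iterations."""
    base = f"CRID{number:05d}"
    return base if version == 1 else f"{base}v{version}"

# A change-log entry in the was/is/changed shape described above.
change_log = {
    "was": "pizza was the winning concept",
    "is": "pizza with either pineapple or ham",
    "changed": "added pineapple to one variant and ham to the second",
}

print(make_crid(7))     # CRID00007
print(make_crid(7, 3))  # CRID00007v3
```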

Sin #5 — you don’t keep learnings in one place, or you don’t have them at all

Finally, how to organize learnings into your own creative bible (where you’re the Jesus of advertising)

You’re still here? Nice, respect! You now know how to prepare the first concepts, test them fast, code variables, and keep track of changes. Now, how do you transform this into learnings? You can do it in a few ways, but (unfortunately for “table people” like me) you need descriptions for this, so it’s harder to analyze in bulk.

  • Visual representation of changes: if you’re motivated enough, you can prepare a “pizza map” for winning creatives. Thanks to this, everyone can grasp really fast what changes were made, how the creative evolved, and what the final results were. The disadvantage is that with many iterations it becomes hard to manage.
  • A written description of changes with key learnings: if I had to select one way to store learnings, I would definitely select this one. Even though I’m a “data person”, this way is safe, easy to consume, and tracks all the steps. Usually the creative team is responsible for it, as they know best how each creative changed and what the learnings were (e.g., we were testing the pizza vs. burger example; pizza won, so we decided to test plain pizza as a control ad against 2 new iterations: pizza with pineapple and pizza with ham). Notion is a great tool for this.
  • Log in Airtable: this is the simplest solution, not the best one but the main one. You can filter each winning creative and see how it evolved over time. Based on this, we write our written description.
I’m really amazed that you’ve reached this point. It wasn’t easy, but now you’re well prepared with the knowledge of how to test and improve creatives in performance marketing. You know you need to eat healthy, do some sport every day, and limit your phone usage, but you don’t do it, right? It’s the same with this system: as long as you don’t use it, you won’t improve your creative testing process. So get your shit together and go run some tests.
