Editor’s Note: This post has been updated with fresh links and new content for sharp-eyed readers like you 😉
Original Publication Date: March 20, 2017
Facebook ad testing would be so much easier if we lived in a fairytale world.
When creating a new ad campaign, we’d just go to our magic PPC mirror and ask: “Mirror, mirror, on the wall, who will convert the highest of them all?”
The room would go dark and a ghostly voice would whisper: “The second ad design from your left.”
Now that would be a slippery slope. (Because you know what happened to the queen with the mirror in the story of Snow White…)
So here we are, left with the good old Facebook ad tests to discover what works and what doesn’t.
But you know what? Testing can be a powerful tool if it’s handled with care. That’s why we thought it was time to share our favorite Facebook ad testing tips, ideas, and best practices.
Wanna get better ad results? Let’s look at how to A/B test the right way.
Why you need Facebook A/B testing
Unless you can see into your Facebook target audience’s minds, it’s impossible to tell what they really think or like.
Do people prefer light or dark images? Should you formulate your ad headline as a question or a regular sentence? What’s your best-performing value offer?
Have you optimized your ad delivery in the most efficient way possible? Are you even targeting the right ad audience?
Split tests can answer all these questions… and help stop your gut feelings from having too much of a say during the ad creation process.
Instead of one team member saying “We should go with a photo ad” and ending the discussion there, marketers can set up multivariate tests to see what works best in action.
Facebook Ad testing in action
Facebook A/B tests can reveal some pretty exciting stuff.
One experiment discovered that a Facebook ad headline that included an emoji had a 241% higher click-through rate than the same ad without one.
That’s not a bad Facebook ad tip, especially if you’re planning to spend thousands of advertising dollars on similar campaigns.
The main reason for Facebook ad testing is to discover new effective ways to spend your advertising budget while getting the maximum result.
But there’s another incentive: the testing process can be genuinely fun.
How Facebook split testing works
Facebook A/B testing starts with the setup process where you create multiple variations of an ad or ad set.
After you’ve set up a testing campaign, Facebook will show your ads to the target audience. And after some time, you’ll be able to draw conclusions based on the Facebook ad testing results.
In addition to ad elements like image and ad copy, you can test other important aspects such as different audience types, ad placements or delivery optimization settings.
An oversimplified version of Facebook A/B testing goes along the lines of: You decide on what to test, set up a test, and collect the results.
In reality, it’s not that simple. Mastering each of these processes takes some trial and error.
However, it’s completely doable (And I’m going to show you how).
Facebook Ad testing campaign structure
Facebook ads have three levels: Campaign, Ad Set, and Ads. Each of these levels comes with different split testing opportunities.
On the Campaign level, you can choose between various campaign objectives.
Your Facebook campaign objective determines what ad template, delivery optimization, and bidding options your ad sets will have.
For example, the campaign objective Conversions will let you optimize your ad delivery for:
- Link Clicks
- Daily Unique Reach
Tip: It’s best to stick to the campaign objective that’s closest to your advertising goal, as it lets Facebook optimize ad delivery and bidding for the result you actually care about.
If you’re unsure which campaign objective to use, you can create two identical campaigns with different campaign objectives.
Facebook A/B tests usually aren’t conducted on the campaign level, however, as there are more substantial testing options on the Ad Set and Ads level.
It might make more sense, then, to test various ad elements within a single campaign and avoid split testing two campaigns.
Ad Set level
On the Ad Set level, marketers can test:
- Ad delivery optimization methods
- Ad placements
- Bidding tactics
- Target audiences
If you want to test different ad set elements, make sure that each of these ad sets contains similar ads. Otherwise, you won’t be able to get relevant Facebook ad testing results as you’ll have no idea which element contributed to one ad set’s superiority over the other.
AdEspresso analyzed over $3 million in ad spend and discovered the ad elements that affected customer campaign performance the most:
- Precise Interests
- Mobile OS
- Age Ranges
- Relationship Status
- Landing Page
- Interested in
Many of these audience targeting factors and ad set elements can be A/B tested by creating multiple Facebook ad sets, each containing a varied version of the tested element.
Ad level
On the Ad level, you can test pretty much everything that will be visible to a person seeing your ad. This includes:
- Ad type
- Images or video
- Ad text
- Link description
The golden rule of Facebook Ad testing
If you intend to test something on the Ad level, keep your Ad Set and Campaign variables unchanged.
If you test too many things at once, you won’t be able to tell which variable drove the difference in results.
Another problem that you might run into is Facebook’s auto-optimization: Facebook will soon start to deliver only the ads that have the highest click-through rates and lowest CPC across your ad set. Sometimes, though, Facebook makes that decision too quickly, leaving you with no relevant A/B testing results.
To get the auto-optimization out of your way, create separate ad sets for each ad variation, and let them run simultaneously.
You may then run into another problem: audience overlap, where the same person sees both of your ads throughout the campaign period. Still, that’s usually the lesser evil among Facebook split testing problems.
Facebook A/B testing best practices
We know how difficult it is to not jump straight to the Ads Manager and create a new test RIGHT NOW.
But if you bear with us a little longer, we’ll help you get high-quality results from your split tests.
One of the biggest Facebook ad mistakes both rookie and seasoned marketers make is testing everything at once.
Here’s some bad news, though — it doesn’t work that way.
Let’s say you wanted to test three different target audiences, two ad headlines, and four ad images.
As a result, you’d get three ad sets with eight ads in each (24 ads in total). You’ll either need to spend thousands of dollars or settle for statistically insignificant results.
The more ad variations you have, the more impressions you need to gather for the results to be statistically significant.
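To see how quickly variations multiply, here’s a quick sketch (the function name and numbers are illustrative, taken from the example above):

```python
from math import prod

def ads_per_ad_set(*element_counts):
    """Number of ad variations per ad set: the product of the option
    counts for each ad-level element being tested."""
    return prod(element_counts)

# The example above: 3 audiences (ad sets), 2 headlines, 4 images.
audiences, headlines, images = 3, 2, 4
per_set = ads_per_ad_set(headlines, images)
print(per_set)              # ads in each ad set: 8
print(audiences * per_set)  # ads across the whole test: 24
```

Every extra element you test multiplies the total, which is why your impression requirements (and budget) balloon so fast.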
What to track for effective Facebook Ad testing
If you use an A/B significance calculator built for website tests, you can adapt it to Facebook ads: instead of website visits, enter the number of impressions on your ad or ad set.
Instead of website conversions, enter either the total number of ad clicks or the number of campaign conversions for each ad or ad set.
Tip: Wait at least 24 hours after publishing before evaluating your split test results. This gives Facebook’s algorithms time to optimize your campaign. (You’ll probably need to wait longer anyway to collect enough results.)
Facebook even says that it takes a few run-throughs to get your ad delivery right:
“When we start delivering your ad set, whether at the start of a campaign or after you edit it, we don’t have all the data necessary to deliver it as stably as possible. To get that data, we have to show ads to different types of people to learn who is most likely to get you optimization events. This process is called the ‘learning phase.’”
There is no firm rule about the number of Facebook campaign results required for reporting, but I’d say it makes sense to collect at least 300-500 clicks per variation and reach at least 10,000 campaign impressions before drawing any conclusions.
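If you’d rather check significance yourself than eyeball the numbers, a two-proportion z-test is one standard way to compare two variations. This is a minimal stdlib-only sketch; the conversion counts are hypothetical:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing two conversion rates, e.g. conversions
    per impression for ad variations A and B. Returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal tails
    return z, p_value

# Hypothetical: 350 vs. 300 conversions from 10,000 impressions each.
z, p = two_proportion_z_test(350, 10_000, 300, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p below 0.05 suggests a real difference
```

Note how even at 10,000 impressions per variation, a 350-vs-300 split is only barely significant, which is why the clicks-per-variation floor above matters.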
Analyzing A/B test results
The goal of your Facebook A/B tests should be to uncover at least a 20% difference in cost-per-result, though the difference between two variations can be as much as (or more than) 300%.
Define the most important ad metrics that have an effect on your Facebook ad testing campaign results and ultimately lead to sales — the end goal of most ad campaigns.
For example, this split testing campaign by AdEspresso had the goal of attracting sales. So it makes sense for them to measure the Cost Per Sale.
You can amuse yourself with vanity metrics such as CPC (cost-per-click) or CPM (cost per 1,000 impressions), but take these with a grain of salt, as they don’t tell you how much it costs to get a conversion.
There are also times when cost-per-mille is your go-to metric (e.g. when running a brand awareness campaign whose goal is to get the maximum number of ad impressions).
The A/B test reporting rule goes like this: Measure the cost of your ultimate campaign goal.
Facebook split testing budgets
Another question we get asked a lot is “What’s the right budget for A/B testing on Facebook?”
Like with many things in life, the answer is “It depends.”
You can start by creating large multivariate A/B tests with extensive budgets. Or you can spend your limited advertising dollars on the quest to a single (but powerful) Facebook ad hack.
To calculate the anticipated cost of your Facebook split testing campaign, take a look at your other campaigns.
What’s the average cost-per-conversion across your existing campaigns? Use it as the basis for your calculations.
Let’s say you want to test three different target audiences and your average cost-per-conversion is $3.50.
As we previously explained, you need roughly 300-500 results per variation to get reliable test results. So you’ll need a campaign budget of about 3 x $3.50 x 300 = $3,150.
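That arithmetic generalizes into a small helper (the function and its defaults are illustrative, not anything Facebook provides):

```python
def min_test_budget(variations, cost_per_result, results_per_variation=300):
    """Rough minimum budget for a Facebook split test: each variation
    needs enough results (300-500 is a reasonable floor) at your
    historical average cost-per-result."""
    return variations * cost_per_result * results_per_variation

# Three target audiences at an average cost-per-conversion of $3.50:
print(min_test_budget(3, 3.50))  # 3150.0
```

Plugging in your own cost-per-conversion before launching a test is a quick way to spot a split test your budget can’t actually support.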
However, you can cheat a little and conclude your tests sooner if you see that one variation is clearly superior to others. Use the remaining budget for conducting another split test to affirm your results or test a new variable.
How to set up a Facebook A/B test
If you want to test multiple target audiences, ad bidding methods or ad placements, you should create multiple ad sets, each with a single variation.
Using Facebook’s split testing feature
Facebook’s Ad Manager update makes it easier than ever to split test ads within your campaign.
When you check the “Create Split Test” box, Facebook will take you through the entire process.
Here’s how the Facebook Split Testing Feature works:
Members of your audience are randomly divided into non-overlapping groups, and see ad sets with identical creative.
Each ad set has one distinct variable, no more.
Your test variable can be different audience types, ad placements, or different ad delivery optimization methods.
You can choose to split budget and reach evenly across ad sets, or weight one heavier than the others.
Facebook will measure the success of each ad set and declare a winner. After an initial Facebook ad testing round is complete, you’ll get a notification email containing the results.
Using the Facebook Ads Manager
If you’re using the Facebook Ads Manager to set up Facebook ad campaigns, start by creating a regular campaign with a single ad set.
Then, duplicate your ad set and select a different target audience, ad placement or another ad set element.
For example, change your ad set’s targeting from a Saved audience to a Lookalike audience.
If you want to test various ad elements (image, ad copy, calls-to-action), create duplicate ad sets, and edit the ads inside each ad set to include a different variable.
Using third-party tools
One of the most convenient ways to set up multivariate A/B tests is to use a Facebook ads manager tool like AdEspresso.
In AdEspresso, you can add multiple ad set or ad element variations during the campaign creation phase, so there’s no need to mess with the duplication process later on.
In addition to your ad elements, you can also test multiple target audiences…
Or select additional Facebook ad testing ideas.
Just because all of these tempting split testing options are available to you, though, doesn’t mean that you can test everything at once.
When testing out AdEspresso, creating a Facebook A/B test with three ad images and two headline variations took me about three minutes. Now that’s what I call turbo-speed test setup.
If you don’t have huge testing budgets, Facebook’s built-in campaign management options might do the job. You won’t need third-party tools all that often anyway, as it takes time to gather relevant test results.
Tip: You can run multiple A/B tests at once, just make sure there’s no overlap in terms of target audiences and offers. (e.g. run one extensive split test for finding the right target audience and another smaller blog promotion test to see which headlines work best).
10 Facebook A/B testing ideas
And now, ladies and gentlemen, if your heads aren’t already packed with ideas, here’s more inspiration in the form of 10 Facebook ad testing ideas.
We’ve listed the split test plans we’ve found the most useful over time, returning the highest ROI in terms of findings.
1) A/B test target audiences
An AdEspresso analysis of Facebook ad statistics found that the audience you target can affect cost-per-click by more than 1000%.
So testing your way to the perfect target audience makes sense. A lot of sense.
Of course, once you get started, the multitude of Facebook ad targeting options can seem confusing at first: Saved audiences, Custom audiences, Lookalike audiences…
This confusing array of targeting options can be a huge advantage, however — you can create lots of highly relevant target audiences depending on the nature of your campaign and offer.
Target audience testing options
For example, you can target people based on their demographics and location.
We’ve seen again and again that geographic specificity in ads and landing pages leads to higher performance. With that in mind, we highly recommend checking out this targeting option. (We did put it first, after all.)
You can also test several Facebook Custom audiences to see which landing page visitors convert at the highest rates and lowest costs.
Another reason to test target audiences is to see which audience matches best with your ad offer.
While cold audiences may be more interested in low-threat offers, the warmer remarketing audiences could turn more easily into paying customers.
Best practices for A/B testing Facebook audiences
- Create at least two target audiences with little to no overlap.
- Use the EXCLUDE feature with Custom audiences to further avoid audience overlap.
- Create test audiences large enough to deliver sufficient results. (An audience of 1,000 might not be worth split testing, unlike an audience of 400,000.)
- Test with different audience types: Saved audiences, Custom audiences, Lookalike audiences
2) A/B test ad placements
Facebook ad placement determines where your target audience sees your ads. And each ad placement can have a completely different ROI.
Ad placement options on Facebook
- News Feed (mobile and desktop)
- Facebook right-hand column
- Audience Network
- Instant Articles
- In-stream Video
In a Facebook experiment, marketers at Scoro discovered that Desktop ads had a 534% higher Cost Per Click than the ads displayed on Mobile + Audience Network.
However, they also discovered that Desktop ads performed a lot better in terms of conversions (Which made all the cheap Mobile clicks kind of useless).
Ad placement and offer match are the main reasons costs per conversion can differ so much. Offers demanding more effort or commitment might be dismissed on Mobile as too time-consuming and complex.
Testing placements can give you even more insight into reaching your target audience. For example, MOO ran ads in at least two placements: the main News Feed and the right-hand column.
Facebook Split Testing has an option for setting up split tests with various ad placements. Simply select a different placement for each ad set and you’re good to go.
If you’re unsure which ad placements to use, here’s what Facebook suggests:
- Brand awareness: Facebook and Instagram
- Engagement: Facebook and Instagram
- Video views: Facebook, Instagram and Audience Network
- App installs: Facebook, Instagram and Audience Network
- Traffic (for website clicks and app engagement): Facebook and Audience Network
- Product catalog sales: Facebook and Audience Network
- Conversions: Facebook and Audience Network
Best practices for A/B testing Facebook Ad placements
- Don’t let your ad placements overlap — each tested ad set should have a different placement.
- Mind that the amount of visible ad copy varies across placements, with News Feed ads showing the most text.
- Don’t mess up your test results by changing other ad elements.
- Use ad images that look good across all devices and placements (e.g. avoid on-image text in tiny font that’s illegible in small-size ads).
3) A/B test bidding methods
Facebook ad bidding isn’t just about placing manual bids. It’s a mixture of budget setup, ad delivery optimization, payment options, and manual bidding.
And nobody knows how it really works.
Just kidding! If you haven’t read our ultimate Facebook bidding guide yet, here’s your chance.
Various ad delivery optimization settings can give your campaigns a staggeringly different reach and ROI.
For example, with a daily budget of €30, one could reach 1,700 – 4,400 people out of 100,000 when optimizing ad delivery for Impressions.
Or reach 3,800 – 9,900 people when optimizing ad delivery for Conversions.
An AdEspresso bidding experiment tested four different Facebook bidding methods: Cost Per Click, Cost Per Mille, optimized Cost Per Mille, and Cost Per Acquisition.
The results were pretty staggering: there were differences of over 3,000% in the ad groups’ reach, impression count, and CPC.
While this was only a small experiment with a fairly low budget, it illustrates the opportunities of experimenting with Facebook bidding.
3 Tips For A/B testing Facebook bidding
- Create a separate ad set for each tested bidding method.
- Start by testing ad delivery options or experiment with manual bidding vs. automatic bidding.
- Before testing, set a preferred Cost Per Conversion for reference, just in case both bidding methods return suboptimal results.
4) A/B test ad types
Facebook has tons of different ad types. And each has a different display size, amount of ad copy, nature of ad image, ad placement, etc.
Depending on your offer, you may want to test these various ad types. However, you’ll want to keep the ads’ style and tone of voice as close to the original as possible. You don’t want other parameters to affect your test results.
News Feed ads
News Feed ads are usually the first choice of Facebook ad beginners. And not just because they’re simple to create and set up. (Although that helps.)
Right-hand column ads
This is one of the most basic types of Facebook ads. Just a headline, a description, and a single image. These ads only appear on desktop.
Lead ads
Facebook Lead Ads give people a quick way to opt into things like eBooks, newsletters, quotes, and offers straight from their mobile devices. You can usually recognize a lead ad by the “Download” call-to-action button.
Carousel ads
Also known as Multi-Product Ads, this ad type showcases up to ten images and links in a single ad.
Dynamic product ads / DPA
These remarketing ads target users based on their past actions on your site.
Page like ads
This Facebook ad type’s goal is to get more likes to your brand’s Facebook Page. If you’re after conversions, aim for other ad types.
Canvas ads
Mobile-optimized and animated Canvas ads help to tell your brand’s story with eye-catching content.
Mobile app install ads
This ad type helps to promote your app and get people to install it on mobile.
Video ads with GIFs
Facebook recently announced that advertisers can now add GIFs to video ads. Why not give it a try?
Tip: There’s a right time and place for each ad type. When weighing your options, ask: “What’s the best ad format for catching my target audience’s attention and presenting my value offer?”
5) A/B test the ad design
A study by Consumer Acquisition found that ad images are incredibly important. In fact, they’re responsible for 75%-90% of ad performance.
When unsure which ad design to use, test up to five markedly different ad images to find the right direction for future designs.
Research has found that people make up their minds within 90 seconds of their initial interactions with either people or products. About 62-90% of their assessment is based on color alone.
Creating more colorful ad images might help get more people to notice your ad, read it, and take your preferred action. Why not set up an A/B test with differently colored ad backgrounds?
Here’s another example of successful Facebook split testing:
Scoro split tested three various ad designs:
- Product screenshot with integration logos (Variation A)
- Product on a light blue background (Variation B)
- A stock image with text on it (Variation C)
Based on this intro, which do you think won the competition?
Variation A (the product screenshot) outperformed Variation C (the stock image), generating a 220% higher click-through rate; Variation C’s cost-per-click was also 146% higher.
Six ideas for A/B testing your Facebook Ad design
- Stock images vs. custom designs
- Test ad images with various color combinations
- Your product vs. a general image
- Text on image vs. no text on image
- High-contrast vs. low-contrast ad designs
- Test a reversed version of your ad image vs. the original
For more Facebook ad design inspiration, check out the ultimate showdown of 32 top-notch Facebook ad examples.
6) A/B test images vs. videos
Facebook videos are believed to have higher click-through rates and lower CPC than image ads — or is it just hype?
A Kinetic Social report found that video ads had the lowest effective cost-per-click (eCPC), averaging $0.18.
But you can never tell for sure unless you’ve experimented with Facebook video ads or posts on your own.
Tip: When creating advertising videos, avoid the top four reasons for low video engagement, according to Social Media Examiner:
- Including an intro
- Using logos or credits at the beginning
- Trying to tell too much in the video
- Having a person talking to the camera without context
You can set up an A/B test of image vs. video in the Facebook Ads Manager easily by replacing the image with video in one ad set variation.
7) A/B test the ad copy
Getting people to notice your Facebook ad in their News Feed is only half the battle.
The next challenge in your Facebook ad testing is convincing them to click with your ad copy.
Facebook allows advertisers to customize every part of the ad text. This means that you’ll have a chance to split test your ads’ main text, headline, and link description.
For example, monday.com (formerly dapulse) has experimented with different ad copy while maintaining the initial ad design:
Tip: We recommend that you A/B test your headlines early and often, as they’re the first lines of text catching the readers’ attention.
According to a computer science study from Columbia University and the French National Institute (Inria), 59% of links shared on social media are never actually clicked: most people share or like a post based on the headline alone.
What to test in your ad copy
- The length of your ad copy
- Including exclamation marks or questions
- Adding emojis to your ad copy
- Split test listing various product benefits
- A/B test limited time offers or various prizes
- Test mentioning your product’s price inside the ad
- Experiment with odd and even numbers when sharing list posts
Facebook ad copy testing can also be used for finding highly engaging headlines for your blog articles. Simply split test 3-5 different ad headlines to see what makes the most people click.
For dozens more Facebook ad copywriting tips, take a look at these 47 Facebook ad tips.
8) A/B Test the Value Proposition
Your Facebook ad is like a candy wrapper. It tempts people to click on your ad to discover the sweet deal on the landing page.
Simply put, the main purpose of your Facebook ad’s copy and design is to present your value proposition in the most compelling way possible.
But what if you’re unsure what your most compelling value proposition is?
A/B testing isn’t the answer to everything in life. However, it could be the answer to the confusion over your UVP (Unique Value Proposition).
Check out this example by CoPromote:
In the first ad, their unique value proposition is “Cross-promote with creators on Twitter, Tumblr, …” In the second ad, they use the UVP of “Reach 500,000 new people per month”.
Split testing multiple value offers can produce fascinating insights, such as:
- Which offer is more compelling to the target audience, making them click on the ad.
- Which UVP delivers a higher ROI, measured by the number of conversions and sales.
Tip: The best place in your ad to test your UVP is either the headline or the main text: wherever the text is most prominent and seen by the most people.
9) A/B test calls-to-action
While “Learn More” is the most widely used Facebook ad call-to-action, it’s not necessarily the best choice in terms of ROI.
By conducting an A/B test, Scoro found that while ads with the Learn More CTA had a higher CTR, it was the Sign Up CTA that had a 14.5% higher conversion rate.
The best practice is to use the call-to-action that best describes what action you want people to take. (For example, if your CTA is “Download,” almost nobody will convert if you ask them to sign up for a service.)
Calls-to-action can be tested on the Ad level, meaning that you can change the variations when editing specific ads.
4 tips and ideas for split testing calls-to-action
- Test 2-3 CTAs that are closest to your desired action.
- Don’t limit yourself to the CTA button, test CTAs in ad headlines.
- Use action verbs to make your CTAs more actionable.
- Avoid calls-to-action that do not match with your ad’s landing page offer.
10) A/B test landing pages
Even after a person has clicked on your Facebook ad and landed on your page, many things can go wrong.
What if they don’t like your landing page design? What if they fail to grasp the headline that seemed so clear and logical to you?
And how can you tell whether to keep your landing page copy super short like SUMO…
Or animated like Typeform…
Or extended with extra copy and images like Moz, whose longer landing page increased sales of its PRO memberships by 51.83%?
If you have a landing page that’s specific to your Facebook ad campaign, set up an A/B test to see what works.
(If you want to test your landing pages regardless of traffic source, Facebook experiments aren’t going to cut it. In that case, you’ll need more comprehensive usability testing tools.)
To set up a Facebook ad test with multiple landing pages, create several ad sets and change the in-ad links.
Quick wrap-up on Facebook Ad testing
Now that you’re full of A/B testing magic, don’t let it fade away: decide what you’re going to test next.
Even if you forget the rest, remember these three key takeaways:
- Avoid overlapping tests and testing too many things at once.
- Give your Facebook tests enough time and budget to deliver relevant results.
- Avoid measuring vanity metrics and focus on the cost-per-result.
And if you’ve got some cool test results you’re itching to share… We want to hear about them!