
Facebook Advertising A/B Tests - Dos and Don'ts


This past week, I've been doing the A/B Testing Mastery course on CXL.com by Ton Wesseling. The course focuses mostly on A/B testing for websites, but I took some of its general guidelines and started applying them to my past Facebook A/B tests. I realized that in my previous tests the chances of naming a false winner were really high, because I hadn't taken so many factors into consideration. I had simply gone ahead with Facebook's guided A/B test setup, and boy, have I now come to realize just how wrong that was.


Why run A/B tests or split tests in the first place?


The number of permutations and combinations of setting up a Facebook ad is endless (or at least feels endless). And unfortunately, there’s no magical shortcut to knowing what is the best-performing combination for your brand/business. There’s no sure-shot way of knowing what bid strategy, ad design, copy, offer or target audience would work for your product/service. The only possible way for you to learn what is working and what is not is by split testing or A/B testing your Facebook ad strategy.


Instead of tailoring your ads to what you think your customers would like and calling it a day, Facebook lets us set up tests for different variables to see which ad variations actually work.


But before you jump into setting up your A/B tests, here are a few dos and don'ts to consider:


1. Do not test if you do not have enough conversions:


From what I've recently learned, the ideal number of conversions to have before running A/B tests is 1,000. You could run A/B tests with smaller conversion numbers, but the chances of making a type 1 error (a false positive) are really high. You can refer to this blog post on CXL.com on how many conversions you really need.
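If you want a quick sanity check on whether you can even detect the kind of lift you're hoping for, a standard power calculation is a reasonable starting point. Below is a minimal Python sketch using statsmodels; the 2% baseline conversion rate and 20% expected lift are hypothetical placeholders, not figures from the course or from Facebook.

```python
# Rough sample-size check before committing to a Facebook A/B test.
# The baseline rate and expected lift are hypothetical placeholders;
# plug in numbers from your own past campaign data.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline_rate = 0.02    # e.g. 2% of clicks convert (assumption)
expected_lift = 0.20    # hoping the variant improves this by 20% (assumption)
variant_rate = baseline_rate * (1 + expected_lift)

effect_size = proportion_effectsize(variant_rate, baseline_rate)
n_per_variant = NormalIndPower().solve_power(
    effect_size=effect_size,
    alpha=0.05,               # 5% false-positive (type 1 error) risk
    power=0.80,               # 80% chance of detecting a real lift
    ratio=1.0,                # equal split between the two ads
    alternative="two-sided",
)
print(f"People needed per variant: {n_per_variant:,.0f}")
print(f"Conversions that implies per variant: {n_per_variant * baseline_rate:,.0f}")
```

With these placeholder numbers you would need roughly ten thousand people per variant; smaller lifts or noisier metrics push that requirement up quickly, which is why testing on a handful of conversions mostly produces false winners.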


With Facebook, you can test for micro-conversions such as clicks, landing page views, CPM, CPC, and even leads (if you're using the lead generation campaign objective). You could also test for purchases and website leads; however, the number of variables between someone viewing the ad and making a purchase is much larger. For example: is the buyer's journey simple? Is the checkout form too long? Are pages not responsive on that person's device or browser? And so on.


However, if you’re using a website with a tried and tested theme/design (such as Shopify and the Shopify themes) then you don’t have to worry about those variables.


2. Do not test if you do not have a sufficient budget:


On top of enough conversions, you also need a sufficient budget. At the time of writing this, I have only one large business client that I generate leads for using Facebook advertising. The requirement is 5,000-6,000 leads per month. A sufficient number of conversions, yes, but a fixed, limited budget that gets approved bi-annually. If I tried to A/B test using the budget currently allocated to Facebook ads, I could seriously hamper their daily business operations. Even if a test lowered their cost per lead by 10%, that gain wouldn't be worth the opportunity cost of missing their daily lead targets while the test runs.


Why? Their average order value is roughly INR 15,000 and the average cost per lead is INR 60, so an improvement of anything less than about 25% wouldn't be worth losing out on even one possible order. Allocating the limited budget to A/B testing currently wouldn't make business sense.
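To make that trade-off concrete, here is a rough back-of-the-envelope check in Python using the figures above; the monthly lead volume and the number of orders lost during a test are illustrative assumptions, not actual client data.

```python
# Back-of-the-envelope opportunity-cost check for a cost-per-lead (CPL) test.
# Values marked "assumption" are illustrative, not actual client figures.
avg_order_value = 15_000   # INR, from the post
cost_per_lead = 60         # INR, from the post
monthly_leads = 5_500      # midpoint of the 5,000-6,000 requirement (assumption)

cpl_improvement = 0.10     # hoped-for 10% reduction in cost per lead
monthly_saving = monthly_leads * cost_per_lead * cpl_improvement

orders_lost_during_test = 3   # orders missed while budget is diverted (assumption)
opportunity_cost = orders_lost_during_test * avg_order_value

print(f"Monthly saving from a {cpl_improvement:.0%} CPL drop: INR {monthly_saving:,.0f}")
print(f"Cost of losing {orders_lost_during_test} orders: INR {opportunity_cost:,.0f}")
print("Worth running the test?", monthly_saving > opportunity_cost)
```

With these placeholder numbers, losing even a few orders wipes out the saving from a 10% improvement, which is why only a much larger lift would justify diverting the budget.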


3. Do your research:


Yes, your audience might be different, and it might not be the same product or service, but you can still learn a lot from reading up on A/B tests conducted by other brands or simply by looking up your competitors. For example, you could find errors in their methodology that you can avoid, you could learn to structure your tests better, you could find a new variable to test, or you could learn enough about a variable that you no longer need to test it yourself.


For example, image vs video ads: there's plenty of information about this out there. The type of CTA to use based on where your customers are in the buyer's journey? That information already exists. The messaging? Look at your competitors' ads and find the ones with the best engagement. Look at the pain points mentioned in the comments on your competitors' ads; it's a gold mine.


4. Do test for 7 days, and extend in cycles of 7 days:


Rather than relying on Facebook's default setting of running a test for 4 days or until 80% statistical significance has been reached, you should test for an entire week. Why? Because the weekend, the beginning of the week, lunchtime, post-work, pre-work, and so on are all factors that influence a customer's mindset. To make sure these don't skew your test result, run the test for a full 7 days. That is also why you should not start or end the test in the middle of a day or week. If you haven't received enough conversions in 7 days, extend the test for another 7 days; even if you hit your target conversion numbers on the 9th or 10th day, let the test run the full 14.
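If you schedule tests programmatically or just want to sanity-check your dates, the rule boils down to starting at the beginning of a day and always ending on a multiple of 7 days. Here is a minimal sketch of that rule; the start date and day count are hypothetical examples.

```python
# Round a test's duration up to whole 7-day cycles, per the rule above.
# The start date and day count are hypothetical examples.
from datetime import date, timedelta
import math

def test_end_date(start: date, days_needed: int) -> date:
    """Extend the test so it always spans whole weeks (7, 14, 21... days)."""
    full_weeks = math.ceil(days_needed / 7)
    return start + timedelta(days=full_weeks * 7)

start = date(2021, 3, 1)  # a Monday, i.e. the start of a week (assumption)
# Suppose the conversion target is only reached on day 10:
print(test_end_date(start, days_needed=10))  # 2021-03-15, a full 14 days later
```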


5. Do not test more than one variable in 1 test.


Do not mix testing your CTA with your creatives, or your ad messaging with your audience. Test only one variable at a time. It's obvious, right? But it's so often overlooked. While I was helping someone on Reddit with their Facebook ads, I asked him about the number of campaigns he had created. "They are all tests," he said. I asked him to walk me through them so we could avoid repeating the same targeting mistakes, and found that each ad set had different ads. Not obviously different ads, but each had small changes: the fonts used, the call to action, the headline text. Why? "To know which one was working," he said. But how do you know what's working if you can't narrow it down to a single variable? That's why you should keep to just one variable when testing.


6. USE THE FACEBOOK TEST SET-UP if you are running tests:


In an ideal world, creating multiple ad sets or ads that differ by a single variable should be enough to give you your results, right? Unfortunately, Facebook is less than ideal. When you set up a test without telling Facebook it is a test, you run into audience overlap: Facebook will show both ads to the people who fall within both of your target audiences. If you try to test an image against a video without going through the A/B test set-up, Facebook will, by default, end up showing people only your video ad. If you try running an audience test without telling Facebook, you'll be competing with yourself for the people who fall into both target groups, which will ultimately increase your costs. There's no way to avoid this except by telling Facebook that the campaign is a test so that it keeps the audiences separate and random. More about that here: Facebook - About A/B Testing


7. Do continue optimizing your ads even if you cannot run A/B tests:


A shocker, but it's true. You can optimize your ads without having to run A/B tests. Testing helps you validate your hypotheses, but to create better ads, sometimes all you have to do is talk to a customer. You can also read this article to learn more about optimizing your ads and choosing tests that are worth your time and resources.


Takeaway:


Even though A/B tests are important, if you do not have the budget or the data to carry them out, or if the ROI wouldn't make much of a difference, you're better off investing in other methods. If you do carry out A/B tests, do so using the Facebook Test Set-Up, and instead of the default 4 days, run your tests in periods of 7 days to minimize the risk of a false winner.


If you found this article useful and would like to be notified when the next one goes live, do not forget to subscribe below.
