A/B testing compares two marketing elements to see which one drives the best results. The method? Test one variable at a time, analyze the data, and double down on what works.

If you’ve been in the digital marketing game a while, you’ve probably spent countless hours (and a hefty chunk of your marketing budget) on stunning visuals, interactive lead forms, and on-brand web elements for your website or landing page.

Yet, despite increased web visits, your bounce rate is through the roof. 

What gives? 

You may have placed too much effort into one marketing strategy without testing the waters. After all, you wouldn’t buy a car without a test run first, right? 

Enter A/B testing: a marketer’s best friend for pinpointing which elements make an ad, web page, email campaign, or landing page convert.

The trick to A/B testing: Test only one variable, focus on statistical significance over timelines, and act fast on results. 

Abigail Beene, one of HawkSEM’s talented in-house SEM managers with a history of growth marketing success, shares her expert tips on A/B testing below.

What is A/B testing?

A/B testing, also known as “split testing,” is a method of comparing two versions of the same marketing asset to see which option better reaches or converts your audience. (Its cousin, multivariate testing, compares several variables at once.)

Think back to those “choose your own adventure books” from the ‘90s. You might not have picked that ominous tunnel in option B if you’d known it led to an abrupt end, right? 

A/B testing is like an insurance policy for your marketing adventure, giving you a preview of what could go wrong (or right) in two scenarios. 

At HawkSEM, we’re no stranger to A/B testing PPC ads and landing pages, and its uses are vast. You can apply it to many different variables in an ad campaign, including creatives, copy and headlines, keyword match types, and landing page variations. 

But the possibilities don’t end there. You can also conduct A/B tests for:

  • Email campaigns (test email subject lines, visuals, copy, etc.)
  • Website elements that shape user experience (web copy, images, fonts, SEO keywords, etc.)
  • Conversion rate optimization (CRO) goals across different creatives or elements

So, what benefits can A/B tests offer your brand? 


Why brands need A/B testing

Your audience is the heartbeat of your brand. You invest time and resources into understanding them through social media listening, surveys, and competitor analysis. Beene describes A/B testing as yet another important means of gathering audience insights. 

“A/B testing is really beneficial for companies of any size,” she says. “Testing allows brands to learn what tactics perform best with their audience. They can then take those learnings and apply them to future decisions, taking the guesswork out of finding what resonates with their users.”

And when you finally find what resonates? The results come pouring in. Take our client TimeWarp Trading, for example.

They approached us seeking more targeted leads and webinar sales through their landing page. We got to work studying their audience of professionals to craft a brand-new, optimized landing page that converts. We A/B tested every element on that landing page to deliver the results they craved.

Here’s what those insights helped us accomplish:

  • 30% increased conversion rate
  • 471% increased return on ad spend (ROAS)
  • Cost per acquisition (CPA) at just 5% of sale cost

Dive into the full case study here.

Want to see the same results? Here’s your guidebook:

How to conduct an A/B test

By the end of this section, you’ll know how to produce a seriously valuable insight for your marketing strategy: which variation drives more conversions and revenue.

But to get those test results, you need to set some parameters, like goals, testing variables, and sample sizes. 

1. Define your goal

Your goal should be specific and measurable. Additionally, you should be able to compare it to your metrics at the end of your split test.

Maybe you want more website visitors to fill out your lead form so you can grow your newsletter subscriptions. Or you’re looking to reduce cart abandonment for a particular product.

Here are some real-world examples to use as a baseline, as well as the results we achieved for our clients after A/B testing.

4 A/B testing examples in action 

Boosting YOY revenue for 686

Our ecommerce winter apparel client, 686, struggled to define their unique selling points and customer personas. We analyzed and re-optimized their product feed and Google Shopping campaigns, and tested new high-res photos and prices to see how they performed.

The data showed us which campaigns attracted the most high-quality leads. We launched and tested remarketing campaigns, adding additional touch points to better reach already interested customers. 

Results: A whopping 562% increase in year-over-year revenue.

Getting more leads and staying within budget for Peninsula General

Our insurance client, Peninsula General, wasn’t content with their 15% ROAS. To reduce their CPA dramatically and rev up the ROI, we created and tested unique selling points and customer personas their competition had missed.

We A/B tested new ads to see which clean, user-friendly designs drove the most conversions. 

Results: Doubled ROAS, all while staying within budget.

Bringing in the leads for Zephyr

SaaS platform Zephyr wanted to scale their market share and increase their user base. The problem? Their fast-paced environment with siloed marketing teams created mixed messaging.

We knew immediately that the data would talk. The first thing it told us: they were bleeding leads through an ineffective landing page.

By redesigning and A/B testing the landing page, we got the data needed to streamline the lead form and optimize the landing page to operate more efficiently.

Results: A 100% increase in lead volume and 80% reduced CPA.

Increasing mobile conversions for SwissGear

Ever wish you had magical powers and could see exactly where your site visitors click on your website? With Hotjar’s heatmaps, you wield that magic. When SwissGear wanted to boost mobile conversion rates, they recruited pros to analyze the data and see where people were bouncing.

Next step? Use Hotjar Heatmaps to reveal visitors’ scroll and click behaviors and test hypotheses about why and when they left the site. After each test, they tried new improvements, gleaning more data to refine and test, again and again.

Results: Conversions skyrocketed by 132%.

So, what can you test that will help you reach a defined goal? You might have a hunch that it’s your broad keyword match types keeping you from qualified leads. That’s one variable you can test. 

2. Pick one variable or element to test

We know just how much goes into an ad campaign. From catchy headline copy to eye-catching landing page visuals, you have abundant variables to A/B test.

Here are some of our favorites to test across different landing pages, email marketing assets, or web pages:

  • Images
  • Homepage slogan or messaging
  • Keywords and keyword match types
  • Videos or illustrations
  • Infographics
  • Ad copy
  • Headings
  • Meta descriptions and snippets
  • Website copy
  • Fonts
  • Colors
  • CTAs
  • Buttons 
  • Lead form items

So, how do you decide which elements to test? According to Beene, it depends on your brand’s priorities: 

“Landing pages are a great place to start with A/B testing,” she explains. “You can split your campaign to where half the ads direct to one variation and the other half direct to a different variation.”
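Most ad platforms and testing tools handle that split for you. But if you’re running a test on your own site, one common approach is deterministic, hash-based bucketing; here’s a minimal sketch in Python, where the function name and experiment label are our own, for illustration:

```python
import hashlib

def assign_variation(user_id: str, experiment: str = "landing-page-test") -> str:
    """Deterministically bucket a visitor into variation A or B (a 50/50 split)."""
    # Hashing the visitor ID together with the experiment name keeps
    # assignment sticky: the same visitor sees the same variation every visit.
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(assign_variation("visitor-42"))  # same visitor, same variation, every time
```

Sticky assignment matters: if a returning visitor bounced between variations, you couldn’t attribute their behavior to either one.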

Just make sure not to vary the two versions too much. You want clarity on which element influences your audience’s behavior, so Beene advises against variations with drastically different elements. 

The same goes for ad copy. While you might want to test out different CTAs, messaging, and unique value propositions (UVPs), stick to testing one variable at a time (more on why later). 

Beene favors UVPs as an ad copy-testing element: 

“I’ve found that it’s really beneficial for brands to test different value propositions in their ad copy to have an idea of what aspects of their product or service are really resonating with their audience.”

3. Establish a sample size

User behavior isn’t uniform across your entire audience. Meaning? A couple of conversions in option A of your split test don’t necessarily reflect how your broader audience will behave. 

For example, getting two purchases directly after launching your new landing page doesn’t necessarily mean that will be the pattern going forward. Why? Because two data points aren’t enough to establish a pattern in the first place.

In this case, a small sample size won’t extract accurate results and insights for your campaign. 

Your sample size should be large enough to produce statistically significant results, meaning you can be confident the difference you see comes from the change you made, not mere chance. The ideal size looks different for each business, but you can analyze your weekly web visitors and traffic specs to determine yours.

Beene also recommends A/B testing tools like Optimizely or Qualtrics to plug in those traffic specs and help you calculate an appropriate sample size for your test automatically.
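If you’d rather see the math those calculators run on, here’s a rough sketch using the standard two-proportion sample-size formula; this is our own illustrative code, not how Optimizely or Qualtrics implements it:

```python
from math import sqrt, ceil
from statistics import NormalDist

def sample_size_per_variation(p1: float, p2: float,
                              alpha: float = 0.05, power: float = 0.8) -> int:
    """Visitors needed per variation to detect a lift from conversion
    rate p1 to p2 with a two-sided two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for a 95% confidence level
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2                          # pooled rate under the null
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p1 - p2) ** 2
    return ceil(n)

# Example: a 3% baseline conversion rate, hoping to detect a lift to 4%
print(sample_size_per_variation(0.03, 0.04))  # ~5,300 visitors per variation
```

Notice how the required sample balloons as the expected lift shrinks: detecting a subtle improvement takes far more traffic than detecting a dramatic one.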

Next order of business: your deadline.

4. Identify your timeframe

It’s important to be realistic about timelines for A/B testing. Two weeks is a good starting point for A/B testing, but not every business will fit this schedule.

“[Timelines] will vary for every company,” Beene explains. “A smaller start-up may not have the time or resources to put toward more expensive testing.”

She adds that, while SMBs can still A/B test, they will likely have to use smaller budgets and draw conclusions more quickly, even if that means sacrificing some data quality.

How do you know if you’re wasting too much of your budget versus collecting just enough data? We won’t sugarcoat it: finding the right balance can be tough. It’s something that comes with experience.

Some marketers recommend halting or postponing tests if you’re working with fewer than 1,000 monthly conversions. But sometimes we can gather enough data with less, as long as the results demonstrate statistical significance.

Our skilled PPC experts? They’ve got decades of industry knowledge and experience to assess a reasonable, budget-aligned time frame for your A/B tests.

5. Assess A/B test results

Fast forward to the end of your A/B testing period. It’s time to dive into the test results! Examine how the different variations measured up against your initial metrics and goals.

“Consider the difference in performance between the two variations when assessing your A/B test results,” says Beene. “If one variation only slightly outperforms the other, you may want to consider some further testing or proceeding with caution when using those learnings.”

And if one variation outperforms the other in a landslide? That’s your clear winner.

“You can confidently determine that applying that strategy on a large scale in your campaigns should lead to improved results,” says Beene.
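What counts as a landslide versus a slight edge? One common way to put a number on it (again, our own sketch rather than any specific tool’s method) is a two-proportion z-test on the raw conversion counts:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value comparing the conversion rates of variations A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Example: A converted 150 of 5,000 visitors (3.0%), B converted 210 of 5,000 (4.2%)
p_value = two_proportion_z_test(150, 5000, 210, 5000)
print(f"p = {p_value:.4f}")  # well under 0.05, so B looks like a genuine winner
```

A p-value below your chosen threshold (0.05 is the usual default) is the statistical version of Beene’s “clear winner”; anything above it means more testing or more caution.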

Alright, we’ve covered the bases. How can you set yourself up for the most success?

A/B testing best practices

A/B testing helps you achieve a stellar landing page, ad, or web page that not only resonates with your audience but also drives conversions. 

Next, we share some of Beene’s pro insights for boosting your confidence level in A/B testing even more:

1. Only test one element at a time

What if you want to test out an ad keyword and call-to-action button at the same time? 

For instance, say you’re a DIY candle-kit ecommerce business and you test:

  • Version A: Yellow CTA button on the landing page with wax illustration dripping on the sides and “DIY candle building” as the ad headline keyword
  • Version B: Neutral CTA button without visual elements and “candle set” as the ad headline keyword

Let’s say Version A brings in more conversions. Yet when you pour your marketing budget into that option, your conversions still fall short. Why? Because you can’t be sure whether it was the ad keyword or the landing page design that piqued your target audience’s interest.

This is why Beene advises against testing more than one element at a time:

“While it may seem more efficient on the surface to test a few different variables at once, when you look back at performance, there will be no way to clearly determine what actually moved the needle.” 

In other words? Multiple elements in an A/B test could muddle your data:

“You don’t have clear learnings to take away and use in future strategy decisions,” says Beene. 

Testing only one element at a time gives you clear, actionable insights into what works and doesn’t work for your audience. But it’s a delicate balance; you also need enough data to assess that element’s performance.

2. Focus on having enough data vs. a specific timeline

Most marketers swear by the two-week timeline as a minimum for A/B testing. While we’ve seen enough data within two weeks to gain actionable insights for a client, that’s not always the case.

That’s because it’s not so much about the length of your A/B test as it is about the quality and volume of data collected. 

“For all companies, it’s important to ensure you have enough data (whether that be from a large budget or a smaller budget over a longer period of time) to pull statistically significant findings, with a clear winner,” says Beene. “Otherwise, there is still some guessing involved when applying your findings on a larger scale.”
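If you want to turn “enough data” into a rough calendar estimate, simple arithmetic will do, assuming you already know your required sample size and how much daily traffic enters the test:

```python
from math import ceil

def estimated_test_days(n_per_variation: int, daily_test_visitors: int) -> int:
    """Rough duration: total sample needed across both variations,
    divided by the daily traffic entering the test."""
    return ceil(2 * n_per_variation / daily_test_visitors)

# Example: ~5,300 visitors per variation (from the earlier calculator)
# at 700 daily test visitors comes to about 16 days, not a fixed two weeks.
print(estimated_test_days(5301, 700))
```

In other words, the two-week rule of thumb is just sample-size math in disguise for a typical traffic level; your own traffic sets your own timeline.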

3. Act fast to avoid wasted ad spend

Let’s say you have the highest confidence level in your A/B test results. The only problem? You spent way more than you meant to on long-term A/B testing.

Is there an ideal strategy to avoid wasted ad spend? 

“A great way to ensure you’re using budget efficiently while testing is to know when you have a clear winner from the test and act quickly,” says Beene.

“If you have spent significant budget on a test and one variable is outperforming the other, pausing the underperformer and reallocating budget as soon as you have those learnings is crucial.”

The takeaway

A/B testing is your ticket to unlocking data-driven insights that engage and convert your audience. It’s the secret sauce every marketer needs to optimize budgets, dodge wasted ad spend, and scale successful strategies for ultimate ROI.

But the hardest part about A/B testing is the time and effort it takes to collect and interpret results. On top of that, brands struggle to pick the right variables to test and key performance indicators (KPIs) to track. 

This is where the expertise and dedication of a top-3% digital marketing agency brings serious value. 

HawkSEM has decades of experience designing and conducting A/B tests for large and small businesses across finance, ecommerce, travel, insurance, B2B, SaaS, and beyond. In other words, we know what to look for and how to make the most of an A/B testing series.

Ready to unleash the full potential of your A/B tests? 

Our PPC experts can lead the way with our meticulous A/B testing process, complete with user behavior insights from heatmaps and data-driven performance results from our proprietary tech, ConversionIQ.

Want to know more about how we can take your efforts to the next level? Let’s talk.

Contact HawkSEM for Free Consultation