An A/B test for a PPC marketing campaign is the process of creating two slightly different versions of an ad and comparing their performance over several weeks. Here, we've rounded up our most impactful A/B tests and shared the results.

From search engine optimization (SEO) to social media marketing to email marketing, A/B testing is a crucial tool to sharpen your marketing campaigns and boost conversions.

Pay-per-click (PPC) campaigns are no different.

Strategic testing can yield a significant boost in your return on investment (ROI), even for those with restrictive budgets like Google Ads grant accounts.

In this guide, we’ll outline four examples of real tests our team conducted that significantly increased ROI for our clients.

What is A/B testing in paid search?

A/B testing for paid search is the method of creating two versions of an ad, showing them to different groups, and comparing their performance to see which is more effective.

Also called split testing, A/B testing lets you experiment with elements of the ad itself (like the copy), its location (where the ad is shown), or its bidding strategy.

This allows you to use real data to effectively optimize your PPC campaigns.

4 A/B tests that boosted ROI for our clients

  1. Bid strategy
  2. Ad copy
  3. Video
  4. Audience signals

A/B testing is one of the most powerful tools paid search marketers can use for conversion rate optimization (CRO) and, as a result, a better ROI.

This guide walks through some common A/B testing ideas for a wide range of digital marketing strategies — from ecommerce-specific marketing to email testing.

But when it comes to ad campaigns specifically, there are a few tests that proved to be especially successful.

Test #1: Bid strategy

Bidding strategies are the goals you provide your advertising platform. When setting up a campaign, advertisers provide a budget (how much they’re willing to spend each day) and a bid strategy that outlines the ad’s top priority.

The thing is, you probably won’t know which bidding strategy is best when first setting up your campaign. This is where A/B testing comes into play.

Choose a bidding strategy to try for a few months, then try another strategy with a different goal. From there, compare the results and optimize.

The bidding strategies include:

  • Manual cost per click (CPC) allows you to define the maximum amount you want to pay for each click on your PPC ads.
  • Manual cost per mille (CPM) lets you set a fixed price for 1,000 impressions of your ad.
  • Maximize clicks automatically adjusts your bids to secure the most clicks possible within your set budget.
  • Enhanced CPC, or ECPC, fine-tunes your manual bids by raising them for clicks likely to convert and lowering them for less promising ones.
  • Target impression share shows your ad on the search page as often as possible within your budget.
  • Automated CPM optimizes your bidding to maximize the number of impressions your ads get.
  • Maximize conversions automatically adjusts your bids to get as many conversions as possible within your budget.
  • Maximize conversion value aims to generate the highest possible revenue from the conversions your ads receive.
  • Target CPA (tCPA) lets you decide how much you’re willing to pay for each conversion, such as a sale or a sign-up.
  • Target ROAS (return on ad spend) is a smart bidding strategy that lets you set a goal for the revenue you want to get (target return) for every dollar you spend on ads.
  • Portfolio bid strategies are useful when you want consistent results and metrics across multiple campaigns.
  • Shared budgets let you distribute your daily budget flexibly across multiple campaigns based on their performance.
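For context, the target-based strategies above (tCPA and target ROAS) optimize toward simple ratios you can compute yourself when comparing test results. A minimal sketch, using hypothetical campaign numbers:

```python
def cpa(spend, conversions):
    """Cost per acquisition: total spend divided by total conversions."""
    return spend / conversions

def roas(conversion_value, spend):
    """Return on ad spend: revenue generated per dollar spent."""
    return conversion_value / spend

# Hypothetical month of campaign data
spend, conversions, revenue = 2_500.00, 50, 10_000.00

print(f"CPA:  ${cpa(spend, conversions):.2f}")  # $50.00 per conversion
print(f"ROAS: {roas(revenue, spend):.1f}x")     # 4.0x, i.e. a 400% return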

How to A/B test bidding strategies

First, identify what you want to achieve with your bid strategy testing. Perhaps you want to increase conversions or improve the efficiency of your ad spend.

A clear goal will help determine the right metrics to track. Then, select two bid strategies that align with your objective. From here:

  • Create two identical campaigns: Create two campaigns with the same settings, keywords, and ads but different bid strategies. Or use the Google Ads Experiments Tool to split-test within a single campaign.
  • Give the experiment time: To gather the most accurate data, let your campaigns run for a few weeks — without making any adjustments mid-test.
  • Monitor your KPIs: Track the metrics that align most with your goals.
  • Analyze and optimize accordingly: After several weeks, compare your data and use your findings to select the most effective bidding strategy.

Case study: tCPA boosts ad spend on a grant account

HawkSEM paid media manager Justin Rodriguez needed to increase spending on a Google Ads grant account.

“This client had a grant account that would never use the full $10K that Google Grant provides,” says Rodriguez.

“We tried a few changes to the account, but there would always be a cap on the max CPC the campaigns would allow for.” With this, Rodriguez hypothesized that changing his bid strategy to tCPA could use the AI’s ability to “skip” Grant account prerequisites and allow for more spending.

Previously, the account had not been able to spend more than about $60 per day.

After the test? Rodriguez’s account accrued:

  • 303% increase in spend
  • 333% increase in conversions
  • 7% decrease in CPA

“What surprised me most was that the positive performance of the campaign flew in the face of some of our best practices,” says Rodriguez.

“Normally, we would need to take steps before using a tCPA strategy, but based on the speculations and articles I found, I decided to give it a try since I felt like I had unsuccessfully ‘followed the rules.’”

Another thing that surprised Rodriguez? “How quickly it took to the new bid strategy,” he says. “It was the only time the account fully spent the $10K allotted (and still the only month it did to date).”

“We are still using (this bidding strategy) today,” he says. “This experience taught me to be open-minded to not sticking to the normal process for the sake of testing.”

Test #2: Ad copy

When it comes to paid search ads, ad copy will make or break your click-through rate (CTR). But to craft the perfect copy, you often need to experiment with A/B testing to find the right match.

Like most A/B testing, start with only one element at a time — this can be the phrasing of your call to action (CTA) or your value proposition.

How to A/B test ad copy

First, determine what you want to improve with your test and consider how your copy can best support that goal. Then:

  1. Create two different versions of one element: Craft two different ad headlines or CTAs, for example.
  2. Use your PPC platform to split your audience: Set both ads to target a similar audience and run under the same campaign.
  3. Let your test run to accumulate data: Typically, a few weeks for the greatest statistical significance.
  4. Review your results: With your preset goals in mind, review your ads’ performance and determine which copy yielded the best results.
  5. Optimize and repeat: Implement changes to your ad copy and choose a new element to A/B test.
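To judge whether a copy test's results are statistically significant rather than noise, a standard two-proportion z-test works well. A minimal sketch, using hypothetical click and conversion counts:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: 40 of 1,000 clicks converted on A, 60 of 1,000 on B
z = two_proportion_z(40, 1000, 60, 1000)
print(f"z = {z:.2f}")  # |z| above 1.96 is significant at the 95% level
```

Here z comes out above 1.96, so the difference would count as significant; a lower value means the test likely needs more data before declaring a winner.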

Case study: CTA-focused ad copy produces a better conversion rate

HawkSEM client Nava Health is an innovative group of holistic health and treatment centers focused on integrative medicine to treat the whole body.

While HawkSEM had already helped the Nava team nearly double conversions and cut CPA by 40%, lead strategist Katie Blatman sought to increase their phone calls by testing new ad copy.

“At the time, the client captured leads two ways: either by calling the practice or submitting a form from their landing pages,” says Blatman.

“They converted more than 80% of leads that called in, compared to 40%-50% of form leads. So we came up with the idea of writing targeted ad copy encouraging users to call.”

From there? “We emphasized how easy it was to get started, with ad copy showcasing that all it takes is a two-minute call.”

Here’s an example of the copy used:

1. A (control group)

Refresh & hydrate in just 30 minutes with our nutrient-packed IV therapies.

2. B (CTA-focused)

A 2-minute call could be the start of your journey back to feeling 100%. Call us today.

The results? A higher conversion rate with lower costs.

“While we supplied the additional headlines and descriptions to Google, we also ran those alongside ad components that had been gathering data for a while,” says Blatman.

“Google tends to prefer showing ads that it knows have converted in the past. But not only did we see more appointments booked on the phone, those became some of our highest performing ad assets,” she says.

And this strategy has continued to be fruitful.

“I use this strategy any time a client closes a decent amount of business over the phone,” adds Blatman. “Of course we always want the user to convert, but sometimes you have to tell them how you’d like them to convert.”

Test #3: Video

Video ads are a highly effective way to engage with your audience — and if you’re running Performance Max (PMax) campaigns, they might be the ticket to boosting conversion rates.

You can determine this with A/B testing.

How to A/B test adding video to PMax campaigns

First, create your video assets. These should be around 15 seconds or less: long enough to communicate your key message, but short enough to hold your target audience’s attention.

Also, ensure your control group’s (non-video) static assets, like headlines and images, are consistent with the video variant so you can accurately test the impact of video.

Then, inside Google Ads:

  1. Create two versions of the same PMax campaign: The “control” (non-video) and “variant” (video).
  2. Make sure all other settings are identical: From bidding strategy and ad copy to budget and schedule, both campaigns should have the same settings aside from video.
  3. Let the versions run for at least two weeks: All A/B experiments are most accurate with time, but PMax campaigns especially benefit as they have an upfront learning period.
  4. Analyze performance: Compare your campaign performance to determine which was more successful according to your KPIs.
  5. Use the winning version going forward and optimize: If video proves to be the more effective version, optimize your campaigns and consider testing with new video versions.
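As a sanity check on test duration, a standard sample-size approximation estimates how many clicks each variant needs before a conversion-rate difference becomes detectable. This sketch assumes 95% confidence and 80% power; the baseline rate and lift are hypothetical:

```python
import math

def sample_size_per_variant(base_rate, lift, alpha_z=1.96, power_z=0.84):
    """Rough clicks needed per variant to detect a relative lift in
    conversion rate at 95% confidence and 80% power."""
    p1 = base_rate                  # control conversion rate
    p2 = base_rate * (1 + lift)     # variant rate if the lift is real
    p_bar = (p1 + p2) / 2
    delta = p2 - p1
    n = 2 * (alpha_z + power_z) ** 2 * p_bar * (1 - p_bar) / delta ** 2
    return math.ceil(n)

# Hypothetical: 3% baseline CVR, hoping to detect a 20% relative lift
print(sample_size_per_variant(0.03, 0.20))  # roughly 14,000 clicks per variant
```

If your campaigns won't accumulate that much traffic in two weeks, extend the test rather than calling a winner early.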

Case study: Video increases conversion rate for PMax campaign

HawkSEM wanted to increase the conversion rate (CVR) for an ecommerce client’s Performance Max campaign. The team tested adding an asset group with a product-specific video against the control asset group without video.

The video asset group saw:

  • 132% increase in conversions
  • 286% increase in ROAS

Test #4: Audience signals

While audience targeting is a massive component of all paid search ads, Performance Max campaigns are unique in that they use audience signals.

Think of audience signals as a starting point for targeting.

As an advertiser, you provide these “signals” as suggestions (i.e., a description of who you think is most likely to engage with your ad), and Google’s AI uses those descriptions to find the users most likely to convert through automated testing over time.

Testing audience signals can help your campaign reach an audience with a higher intent faster — and people who you (and Google) might’ve missed otherwise.

How to test audience signals in Performance Max

Performance Max campaigns are set up differently than traditional Google Search campaigns. Unlike Search campaigns, which use ad groups, PMax uses asset groups.

Instead of setting up keywords or audiences at the ad group level, you provide audience signals for each asset group.

So, to test a new audience signal, you simply create a new, identical asset group with the new signal and let it run against the existing asset groups and their signals.

Here’s how:

  • Select the Performance Max campaign you want to test
  • Go to “Asset groups” inside your campaign
  • Find the asset group you want to duplicate
  • Click the three-dot menu, then click “Copy”
  • Click “Paste” to duplicate the asset group within your campaign
  • Rename the duplicated asset group
  • Go to the “Audience” section and click “Edit audience signals”
  • Click “Add a new signal”
  • Define your new audience signal: you can create a custom segment; choose from affinity, in-market, or life events segments; or upload your own customer list
  • Save your changes

Case study: Competitor login audience signal test increases CVR & ROAS

HawkSEM Paid Media Manager Amy Owings wanted to increase the conversion rates (CVR) and ROAS for her clients’ PMax campaigns.

To accomplish this, Owings tested a new audience signal: people who visited account or subscription pages on competitor websites or searched for a competitor’s login page (for example, “[competitor name] login”).

“We’d seen general competitor audiences work well for eComm,” says Owings.

“We were curious if a heightened version of that—people who had logged into accounts on competitor websites, therefore loyalists—would also work well.”

In three out of four tests, this led to increased ROAS and CVR for her clients. The strongest results came from consumer packaged goods (CPG) and appliances, while skincare showed contrary results.

With this data on hand, the new audience signal was added as a unique asset group within each campaign.

“I was surprised by just how quickly and significantly this audience lifted CVR in PMax,” says Owings.

Other common A/B tests for PPC

Looking for more A/B testing ideas? We’ve got you covered. This guide walks you through the top 16 ideas to refine your user experience and boost conversions, including:

  • Product image variations
  • Price point testing
  • Checkout process optimization
  • Cart abandonment strategies
  • Subject line variations
  • CTA button designs
  • Personalization techniques
  • Timing and frequency
  • Different headlines
  • Page layout testing

The takeaway

No matter which ad formats you use, A/B test results allow you to make data-driven decisions for optimal PPC performance.

As industry leaders, we’ve navigated ways to boost our clients’ ROI through effective testing processes, from innovative bid strategies for limited budgets to creative audience signals that make an instant impact.

If you don’t have the time or expertise to tackle testing yourself, reach out to our team of experts to see how we can increase your ROAS.

Patience Hurlburt-Lawton

Patience is a writer, editor, and educator. As a content marketing manager at HawkSEM, Patience leans into the power of empathy and understanding to create content that connects the dots. When she’s not a writer, she’s a singer/songwriter, trail romper, and adventure seeker with her wolfie dog, Jackson.