A/B testing for PPC campaigns involves running two variations of an ad to see which performs better over time. Here, we round up our most impactful A/B tests and share the results.
When it comes to pay-per-click (PPC) campaigns, even the smallest changes can make a big impact.
Whether it’s a new call-to-action (CTA), ad copy adjustment, or landing page design, A/B testing helps advertisers craft campaigns that drive more conversions.
In this guide, we’ll share five A/B tests our team ran that significantly increased ROI for our clients.
What is A/B testing in paid search?
A/B testing for paid search is the process of creating two versions of an ad, showing them to different audiences, and comparing their performance to see which is more effective.
Also called split testing (not to be confused with multivariate testing, which varies several elements at once), A/B testing can cover creative elements of the ad (like the copy), placement (where the ad is shown), or bidding strategy.
This allows you to use real data to effectively optimize your PPC campaigns.
5 A/B tests that boosted ROI for our clients
From bidding strategy to audience signals, these five A/B tests are real examples of small ad variations that led to big results.
Test #1: Bid strategy
In PPC marketing campaigns, bidding strategies tell Google what your primary goals are and how much you’re willing to pay for a click.
Bidding strategies include:
- Manual cost per click (CPC) allows you to define the maximum amount you want to pay for each click on your PPC ads.
- Manual cost per mille (CPM) lets you set a fixed price for 1,000 impressions of your ad.
- Maximize clicks automatically adjusts your bids to secure the most clicks possible within your set budget.
- Enhanced CPC, or ECPC, fine-tunes your manual bids by raising them for clicks most likely to convert and lowering them for less promising ones.
- Target impression share automatically sets bids so your ad appears on a chosen percentage of eligible searches (anywhere on the results page, at the top, or at the absolute top) within your budget.
- Automated CPM optimizes your bidding to maximize the number of impressions your ads get.
- Maximize conversions automatically adjusts your bids to get as many conversions as possible within your budget.
- Maximize conversion value aims to generate the highest possible revenue from the conversions your ads receive.
- Target CPA (tCPA) lets you decide how much you’re willing to pay for each conversion, such as a sale or a sign-up.
- Target return on ad spend (tROAS) is a smart bidding strategy that lets you set a goal for the revenue you want to get (target return) for every dollar you spend on ads.
- Portfolio bid strategies apply a single automated strategy across multiple campaigns, which is useful when you want consistent results and metrics across them.
- Shared budgets let you distribute your daily budget flexibly across multiple campaigns based on their performance.
The thing is, you probably won’t know which bidding strategy is best when first setting up your campaign. This is where A/B testing comes into play.
How to A/B test bidding strategies
Start by defining what you want to achieve with your bid strategy testing, whether that’s increasing conversions, lowering cost per click, or improving return on ad spend.
A clear goal will help determine the right metrics to track. Then, select two bid strategies that align with your objective. From here:
- Create two identical campaigns: Create two campaigns with the same settings, keywords, and ads but different bid strategies. Or use the Google Ads Experiments Tool to split-test within a single campaign.
- Let the test run: To gather the most accurate data, let your campaigns run for a few weeks — without making any adjustments mid-test.
- Monitor your KPIs: Track the metrics that align most with your goals.
- Analyze and optimize accordingly: After several weeks, compare your data and use your findings to select the most effective bidding strategy.
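Want a quick way to handle that last step? A two-proportion z-test can tell you whether the gap between the two campaigns’ conversion rates is real or just noise. Here’s a minimal Python sketch using statsmodels; the click and conversion counts are hypothetical placeholders for your own campaign data.

```python
# Compare conversion rates (CVR) between the two bid strategy campaigns.
# The counts below are hypothetical -- swap in your own exports.
from statsmodels.stats.proportion import proportions_ztest

control = {"clicks": 4200, "conversions": 168}  # bid strategy A
variant = {"clicks": 4350, "conversions": 209}  # bid strategy B

# Two-sided test of the null hypothesis that both campaigns
# convert clicks at the same rate
stat, p_value = proportions_ztest(
    count=[variant["conversions"], control["conversions"]],
    nobs=[variant["clicks"], control["clicks"]],
)

for name, data in (("control", control), ("variant", variant)):
    print(f"{name}: CVR = {data['conversions'] / data['clicks']:.2%}")
print(f"p-value = {p_value:.4f}")  # under ~0.05, the lift likely isn't noise
```

If the p-value comes back high, the honest conclusion is “no clear winner yet”: keep the test running rather than crowning a champion early.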
Case study: tCPA boosts ad spend on a grant account
HawkSEM paid media manager Justin Rodriguez faced a challenge: increasing spend on a Google Ads grant account.
The Google Ad Grants program gives eligible nonprofits $10,000 monthly in free search advertising; however, “this client’s account would never use the full $10,000 that Google Grant provides,” says Rodriguez.
Despite multiple adjustments, the campaigns hit a ceiling due to the program’s max CPC restrictions.
With this, Rodriguez hypothesized that switching his bid strategy to tCPA could leverage the AI’s ability to “skip” Grant account prerequisites and allow for more spend.
Previously, the account had not been able to spend more than about $60 per day.
After the test? Rodriguez’s account saw:
- 303% increase in spend
- 333% increase in conversions
- 7% decrease in CPA
“What surprised me most was that the positive performance of the campaign flew in the face of some of our best practices,” says Rodriguez.
“Normally, we would need to take steps before using a tCPA strategy, but based on the speculations and articles I found, I decided to give it a try since I felt like I had unsuccessfully ‘followed the rules.’”
Another thing that surprised Rodriguez? “How fast it took to the new bid strategy,” he says. “It was the only time the account fully spent the $10,000 allotted (and still the only month it did to date).”
“We are still using [this bidding strategy] today,” he says. “This experience taught me to be open-minded [about] not sticking to the normal process for the sake of testing.”
Test #2: Ad copy
Testing the messaging of your text ads can improve CTR and reveal how different demographics respond to specific CTAs, value props, or tones.
How to A/B test ad copy
First, determine what you want to improve with your test, such as click-through rate (CTR), conversions, or any key metric, and consider how your copy can best support that goal. Then:
- Create two different versions of one element: Craft two different ad headlines, for example.
- Split your audience: Target the same audience with both ads and run them under the same campaign so impressions are split between the variants.
- Let your test run: Typically a few weeks, until each variant has enough data for statistical significance (the sketch after this list shows one way to estimate how much is enough).
- Review your results: With your preset goals in mind, review your ads’ performance and determine which copy yielded the best results.
- Optimize and repeat: Implement changes to your ad copy and choose a new element to A/B test.
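Wondering how long “a few weeks” should be for your account? One option is to estimate up front how many impressions each headline needs before a CTR lift you care about becomes detectable. Here’s a rough sketch using statsmodels’ power analysis; the baseline CTR, target CTR, and traffic figures are all hypothetical.

```python
# Estimate the sample size needed to detect a CTR lift between two headlines.
# All numbers are hypothetical -- plug in your own baseline and traffic.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline_ctr = 0.030  # current headline's CTR
target_ctr = 0.036    # smallest lift worth acting on

effect = proportion_effectsize(target_ctr, baseline_ctr)  # Cohen's h
n_per_variant = NormalIndPower().solve_power(
    effect_size=effect,
    alpha=0.05,  # 5% false-positive rate
    power=0.8,   # 80% chance of catching a real lift
)

print(f"~{n_per_variant:,.0f} impressions needed per headline")
daily_impressions = 1500  # hypothetical traffic per variant
print(f"~{n_per_variant / daily_impressions:.0f} days at current volume")
```

If the estimate comes back at months rather than weeks, test a bolder copy change: a bigger expected lift needs far fewer impressions to detect.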
Case study: CTA-focused ad copy produces higher conversion rate and lower costs
HawkSEM client Nava Health is an innovative group of holistic health and treatment centers focused on integrative medicine to treat the whole body.
While HawkSEM had already helped the Nava team nearly double conversions and cut CPA by 40%, lead strategist Katie Blatman sought to increase their phone calls by testing new ad copy.
“At the time, the client captured leads two ways: either by calling the practice or submitting a form from their landing pages,” says Blatman.
“They converted more than 80% of leads that called in, compared to 40%-50% of form leads. So we came up with the idea of writing targeted ad copy encouraging users to call.”
From there? “We emphasized how easy it was to get started, with ad copy showcasing that all it takes is a two-minute call to get started.”
Here’s an example of the copy used:
A (control group)
Refresh & hydrate in just 30 minutes with our nutrient-packed IV therapies.
B (CTA-focused)
A 2-minute call could be the start of your journey back to feeling 100%. Call us today.
The results? A higher conversion rate with lower costs.
“While we supplied the additional headlines and descriptions to Google, we also ran those alongside ad components that had been gathering data for a while,” says Blatman.
“Google tends to prefer showing ads that it knows have converted in the past. But not only did we see more appointments booked on the phone, those became some of our highest performing ad assets,” she says.
And this strategy has continued to be fruitful.
“I use this strategy any time a client closes a decent amount of business over the phone,” adds Blatman. “Of course, we always want the user to convert, but sometimes you have to tell them how you’d like them to convert.”
Test #3: Video
Video ads are a highly effective way to engage with your audience — and if you’re running Performance Max (PMax) campaigns, they might be the ticket to boosting conversion rates.
You can determine this with A/B testing.
How to A/B test adding video to PMax campaigns
First, create your video assets. These should be 15 seconds or less: long enough to communicate your key message, but short enough to hold your target audience’s attention.
Also, ensure your control group’s (non-video) static assets, like headlines and images, are consistent with the video variant so you can accurately test the impact of video.
Then, inside Google Ads:
- Create two versions of the same PMax campaign: The “control” (non-video) and “variant” (video).
- Make sure all other settings are identical: From bidding strategy and ad copy to budget and schedule, both campaigns should have the same settings aside from video.
- Let the versions run for at least two weeks: All A/B experiments are most accurate with time, but PMax campaigns especially benefit, as they have an upfront learning period.
- Analyze performance: Compare your campaign performance to determine which was more successful according to your KPIs.
- Use the winning version going forward and optimize: If video proves to be the more effective version, optimize your campaigns and consider testing with new video versions.
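If you’d rather pull that comparison programmatically than eyeball it in the UI, here’s a sketch using the official google-ads Python client. The customer ID and campaign names are placeholders, and you’d adjust the GAQL date range to match your test window.

```python
# Pull KPIs for the control and video-variant PMax campaigns.
# The customer ID and campaign names below are placeholders.
from google.ads.googleads.client import GoogleAdsClient

client = GoogleAdsClient.load_from_storage()  # reads your google-ads.yaml
ga_service = client.get_service("GoogleAdsService")

query = """
    SELECT
      campaign.name,
      metrics.conversions,
      metrics.conversions_value,
      metrics.cost_micros
    FROM campaign
    WHERE campaign.name IN ('PMax - control', 'PMax - video variant')
      AND segments.date DURING LAST_30_DAYS
"""

for row in ga_service.search(customer_id="1234567890", query=query):
    cost = row.metrics.cost_micros / 1e6  # micros -> account currency
    roas = row.metrics.conversions_value / cost if cost else 0.0
    print(f"{row.campaign.name}: conversions={row.metrics.conversions:.1f}, "
          f"cost={cost:,.2f}, ROAS={roas:.2f}")
```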
Case study: Video increases conversion rate for PMax campaign
HawkSEM wanted to increase the conversion rate (CVR) for an ecommerce client’s Performance Max campaign.
The team tested adding an asset group with a product-specific video against the control asset group without video.
The video asset group saw:
- 132% increase in conversions
- 286% increase in ROAS
Test #4: Audience signals
While audience targeting is a massive component of all paid search ads, Performance Max campaigns are unique in that they use audience signals.
Think of audience signals as a starting point for targeting.
As an advertiser, you provide these “signals” as suggestions (i.e., a description of who you think is most likely to engage with your ad), and Google’s AI uses them to find the people most likely to convert, refining its targeting automatically over time.
Testing audience signals can help your campaign reach higher-intent audiences faster — and reach people you (and Google) might’ve missed otherwise.
How to test audience signals in Performance Max
Performance Max campaigns are set up differently from traditional Google search ads. Unlike Google search, which organizes targeting by ad groups, PMax uses asset groups.
Instead of assigning keywords or audiences at the ad group level, you provide audience signals for each asset group.
So, to test a new audience signal, create a duplicate asset group with the new signal and let it run alongside your existing groups.
Here’s how:
- Select the Performance Max campaign you want to duplicate
- Go to “Asset Groups” inside your campaign
- Find the asset group you want to duplicate
- Click the three-dot menu, then click “Copy”
- Click “Paste” to duplicate the asset group within your campaign
- Rename the duplicated asset group
- Go to the “Audience” section and click “Edit audience signals”
- Click “Add a new signal”
- Define your new audience signal — choose from affinity, in-market, or life events, create a custom segment, or upload your own customer list
- Save
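Once the duplicated asset group has gathered data alongside the original, you can compare them side by side. Here’s a minimal pandas sketch; it assumes a CSV export with the column names shown, which are hypothetical and would need to match your own report.

```python
# Compare CVR and ROAS across asset groups from a performance export.
# The file name and column names are hypothetical.
import pandas as pd

df = pd.read_csv("asset_group_stats.csv")

totals = df.groupby("asset_group")[
    ["clicks", "conversions", "conv_value", "cost"]
].sum()
totals["CVR"] = totals["conversions"] / totals["clicks"]
totals["ROAS"] = totals["conv_value"] / totals["cost"]

# Highest-ROAS asset group first
print(totals[["CVR", "ROAS"]].sort_values("ROAS", ascending=False))
```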
Case study: Competitor login audience signal test increases CVR & ROAS
HawkSEM paid media manager Amy Owings wanted to increase the conversion rates (CVR) and ROAS for her clients’ PMax campaigns.
To accomplish this, Owings tested a new audience signal: people who visited account and subscription pages on competitor websites or searched for a competitor’s login (for example, “[competitor name] login”).
“We’d seen general competitor audiences work well for ecommerce,” says Owings.
“We were curious if a heightened version of that — people who had logged into accounts on competitor websites, therefore loyalists — would also work well.”
The results were promising:
In three out of four tests, this led to increased ROAS and CVR. The strongest performance came from consumer packaged goods (CPG) and appliance campaigns, while skincare campaigns showed less impact.
With this data, the new audience signal was added as a unique asset group within each campaign.
“I was surprised how quickly and significantly this audience lifted CVR in PMax,” says Owings.
Test #5: Geotargeting
Geotargeting lets advertisers control where ads appear based on location.
With geotargeting, advertisers can target audiences in specific countries, regions, states, cities, or neighborhoods.
It also allows them to include localized ad copy and elements like addresses or phone numbers — and even adjust for seasonality and weather conditions.
There are two primary geotargeting options:
- Presence Only: Targets people in a physical location
- Presence or Interest: Targets people who are in the physical location or have shown interest in the area, even if they’re not physically there. Google Ads applies this broader setting to your targeted locations by default.
How to test geotargeting
First, decide what goals are most important to measure — click-through rate, conversions, ROAS, or another KPI. Then:
- Choose your targeting variations: Presence Only versus Presence or Interest, for example. You could also test different regions or radius targeting for local campaigns.
- Set up duplicate campaigns or ad sets: Keep everything else identical and change only the geotargeting (the sketch after this list shows one way to double-check this).
- Run the test for a sufficient time: Avoid making mid-test changes that could skew results.
- Monitor performance: Track the metrics tied to your goal, such as clicks, conversions, ROAS, or CPA, and compare results between the two geotargeting strategies.
- Analyze and implement: Identify which geotargeting option performs better, apply the winning strategy across other campaigns or asset groups, and repeat the test with new variations if desired.
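Before launching, it’s worth confirming the duplicated campaigns really do differ only in their location setting. This sketch reads each campaign’s geo target type through the Google Ads API; the customer ID and campaign naming pattern are placeholders, and the field names follow the API’s GeoTargetTypeSetting (verify against the API version you’re on).

```python
# Check which location setting each test campaign uses:
# PRESENCE = "Presence Only", PRESENCE_OR_INTEREST = the broader default.
from google.ads.googleads.client import GoogleAdsClient

client = GoogleAdsClient.load_from_storage()
ga_service = client.get_service("GoogleAdsService")

query = """
    SELECT
      campaign.name,
      campaign.geo_target_type_setting.positive_geo_target_type
    FROM campaign
    WHERE campaign.name LIKE '%Geo test%'
"""

for row in ga_service.search(customer_id="1234567890", query=query):
    geo_type = row.campaign.geo_target_type_setting.positive_geo_target_type
    print(f"{row.campaign.name}: {geo_type.name}")
```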
Case study: Expanded geotargeting boosts conversions and lowers costs
HawkSEM client Tricon is a rental housing company with locations throughout Canada and over two dozen U.S. states.
SEM manager Brad Williams wanted to see whether expanding the geotargeting setting to Presence or Interest would improve performance for three of their accounts.
While most clients are best served by Presence Only, Tricon operates in the housing space, where potential customers may be searching from outside the immediate area. This made it an ideal opportunity to explore broader targeting.
The results were promising.
Two accounts (Bella View and East Ridge) saw significant improvements, including higher conversions, improved CVR, more efficient CPA, and lower CPC.
Meanwhile, the third account (Stapleton Park) showed neutral results, roughly a tie between the two settings.
Other common A/B tests for PPC
Looking for more A/B testing ideas? We’ve got you covered. This guide walks you through the top 16 ideas to refine your user experience and boost conversions, including:
- Product image variations
- Price point testing
- Checkout process optimization
- Cart abandonment strategies
- Subject line variations for email marketing
- CTA button designs
- Personalization techniques
- Timing and frequency
- Different headlines
- Page layout testing
The takeaway
No matter which ad formats you use, A/B test results allow you to make data-driven decisions for successful ad campaigns.
As leaders in the digital marketing space, we’ve navigated ways to boost our clients’ ROI through effective testing processes — from SEO to social media marketing campaigns to PPC.
If you don’t have the time or expertise to tackle testing yourself, reach out to our team of experts to see how we can increase your ROAS and build a custom marketing strategy to hit your goals.
From retailers to SaaS and every industry in between, we’re here to help.
This article has been updated and was originally published in December 2024.