An idiot’s guide to testing in paid advertising — Hallam

In the fast-paced world of paid advertising, success stories are often born out of meticulous testing and optimisation. In this blog, I explore why testing is so important to success, how to structure tests and interpret results, and ultimately what you can begin testing now to land you on a path to success.

Understanding testing and its importance

Testing, in the context of paid advertising, refers to the process of experimenting with different elements of a campaign to identify the most effective strategies. It’s not just about trial and error; it’s a strategic approach to refining your advertising efforts based on data and insights.

Why is testing so important? Consider advertising as a complex puzzle with numerous pieces: ad copy, visuals, targeting, and more. Testing allows us as advertisers to figure out how these pieces fit together for the best results. Without testing, we’re left in the dark, relying on assumptions rather than data to guide our decisions – which is a very dangerous place to be!

The success of testing in action

Imagine a scenario where a well-crafted advertising campaign was falling short of expectations. Engagement was good but not great, conversions were flowing but not fast enough, and frustration was mounting. This was me with one of my biggest clients when I first got into paid advertising.

In this particular case, I decided to conduct a series of tests to identify the weak links in the campaign. I experimented with different ad copy and tweaked my audience targeting – both providing small jumps in performance, but one test was the key to producing a drastically stronger performance.

The client in question was an ecommerce brand offering aftermarket parts and body kits for sports cars. As you can imagine, these were pretty expensive products in a heavily competitive industry. 

The main issue we faced was that the products had strict selling prices set by the manufacturer, meaning we could not beat competitors on price or with discounts. Every company selling the same products had the same prices and the same product images, and was likely getting similar results.

With this in mind, I proposed a test: buck the trend with the one thing we could change by trialling a switch in product imagery, from the clean, clinical product layouts on a white background to lifestyle images showing the products in use on a car.

A body kit is a very visual product to sell so my theory was that we needed to test as much as we could to find the best way to drive performance, taking advantage of the elements we could change.

Left: product parts laid out on a white background
Right: lifestyle imagery showing products on the car itself

Above on the left you can see an example of what the client's shopping ads looked like when I first took over in 2017 (they're not terrible). All shopping ads for the sector looked like this – with the product laid out on a white background – and on the right is an example of the imagery we trialled in our test.

So, with a theory in place, we launched our test and monitored one key metric as our success measure: return on ad spend (ROAS). We split the budget in half between old and new imagery and ran the campaigns for three months – the results speak for themselves.

Left: the imagery of the product laid out achieved a ROAS of 740%
Right: the lifestyle imagery achieved a ROAS of 1,260% – the clear winner
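If you're new to the metric, ROAS is simply revenue divided by ad spend, usually expressed as a percentage. Here's a minimal sketch of the calculation – the revenue and spend figures are hypothetical, chosen only to reproduce the two headline percentages above, not the campaign's real numbers:

```python
def roas(revenue, ad_spend):
    """Return on ad spend, expressed as a percentage."""
    return revenue / ad_spend * 100

# Hypothetical figures for illustration only
white_background = roas(revenue=3700, ad_spend=500)
lifestyle = roas(revenue=6300, ad_spend=500)

print(f"White background: {white_background:.0f}%")  # 740%
print(f"Lifestyle:        {lifestyle:.0f}%")         # 1260%
```

The point of a single success metric like this is that it rolls price, volume and spend into one comparable number for each variant.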

Structuring tests for success

Now that you understand the importance of testing, let’s jump into how to structure tests for success. It’s not just about randomly changing elements and hoping for the best. A well-structured testing strategy involves careful planning and execution. 

There are a few simple steps you need to follow.

Define clear objectives

Before diving into testing, identify what you want to achieve. Are you aiming to increase click-through rates, boost conversions or enhance brand awareness? Clearly defined objectives will guide your testing process.

Identify variables

Decide which elements of your campaign you want to test. This could include ad copy, visuals, audience targeting, or even the timing of your ads. By isolating variables, you can pinpoint what is driving changes in performance.

Determine test duration

Testing is not a one-time event. It requires time to gather meaningful data. Determine a reasonable test duration based on your campaign goals and industry benchmarks. Avoid making decisions based on short-term fluctuations – the magic words here are statistical significance!

Budget allocation

Allocate your budget strategically for testing purposes. Distribute funds among different test variations to ensure a fair evaluation. This may involve running parallel campaigns or allocating a percentage of your budget specifically for testing – I always suggest saving around 10% purely for testing!

Interpreting results

Once your tests are running, the next crucial step is interpreting the results. Effective interpretation goes beyond surface-level metrics and involves a deeper understanding of what the data is telling you.

Statistical significance

Ensure that your results are statistically significant. Small sample sizes can lead to misleading conclusions. Use statistical tools to validate your findings and avoid making changes based on random fluctuations.
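If you want to sanity-check significance yourself rather than rely on a platform's built-in tools, a two-proportion z-test is the standard way to compare, say, click-through rates between two ad variants. This is a rough sketch using only the Python standard library, with illustrative click and impression counts of my own invention:

```python
import math

def two_proportion_z(clicks_a, imps_a, clicks_b, imps_b):
    """Two-tailed z-test comparing the CTRs of two ad variants."""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical: variant A gets 200 clicks from 10,000 impressions,
# variant B gets 260 clicks from 10,000 impressions
z, p = two_proportion_z(clicks_a=200, imps_a=10000, clicks_b=260, imps_b=10000)
print(f"z = {z:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Difference is statistically significant at the 95% level")
```

The key intuition: the same CTR gap on a few hundred impressions would not pass this test, which is exactly why small sample sizes mislead.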

Focus on key metrics

Identify key performance indicators (KPIs) relevant to your objectives. If you’re testing for click-through rates, focus on that metric. For conversions, prioritise metrics such as conversion rate and cost per conversion.

Avoid premature conclusions

It’s easy to get excited or discouraged by early results. However, avoid making premature conclusions. Allow your tests to run for the predetermined duration to gather sufficient data for a reliable analysis. The biggest mistake you can make here is jumping the gun and launching something that started off well but isn’t proven, or, even worse, panic-pausing a test before it’s had a chance to deliver.

Now that we’ve covered the fundamentals of testing, let’s explore three replicable examples that you can integrate into your own advertising strategy.

Three replicable test examples

Example 1: A/B testing ad copy

The setup: create two versions of your ad copy, keeping all other elements constant.

The objective: determine which ad copy resonates better with your target audience.

Metrics to monitor: track click-through rates, conversion rates, and any other relevant engagement metrics.

Interpretation: identify the winning ad copy based on performance metrics like CTR and CPC and implement the successful version in your campaign.

Example 2: audience targeting experiment

The setup: test different audience segments by adjusting demographic or interest targeting.

The objective: identify the most responsive audience segment for your product or service.

Metrics to monitor: evaluate engagement metrics, conversion rates, and the overall performance of each segment.

Interpretation: determine which audience segment yields the best results and refine your targeting strategy accordingly.

Example 3: visual elements testing

The setup: experiment with different visual elements, such as images or videos, in your ads.

The objective: identify the visuals that capture the audience’s attention and drive engagement.

Metrics to monitor: track metrics related to visual engagement, such as click-through rates and conversion rates.

Interpretation: select the visuals that contribute to higher engagement and incorporate them into your ongoing campaigns.


In the ever-evolving landscape of paid advertising, testing stands as a cornerstone for success. As you navigate the intricacies of your advertising strategy, remember that testing is not a one-time event but a continuous process of refinement. Embrace the data, learn from each test, and use the insights gained to propel your campaigns to new heights.

If you want any further advice when it comes to testing from our Paid Media team, then get in touch.
