How To Conduct Ad Tests In Enhanced Campaigns
Enhanced campaigns have brought many changes to AdWords. One of the biggest changes, and one that has received little discussion, is that your ad testing methods will have to change.
One of the “features” of enhanced campaigns is that your campaign now runs on desktops, tablets, and mobile devices, with per-device CPCs controlled by bid modifiers. Since your ads can run on multiple devices at the same time, you need to evaluate your ad metrics by device.
This can easily be accomplished with device preference and Excel filters. First, let’s discuss why this change needs to occur, and then, how to control the ad serving to ensure you are testing your enhanced campaign ads properly.
Why The Testing Change?
Let’s say we’re testing two ads and that we’re running both ads on all devices (desktops/tablets and mobile devices). What happens is that after a while, we check our metrics and we see data that looks like this:
If you simply used this data as-is, you would assume that Ad 1 is the best ad overall and go with that ad.
However, averages hide all the useful data. You need to segment your data to truly understand what is happening. If you were to segment these two ads by device type, the data looks much different:
In reality, Ad 1 is not the best ad — it is the best ad on mobile devices. The best ad on desktop devices is Ad 2.
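The effect described above can be demonstrated with a short sketch. The numbers below are purely hypothetical (the article's original tables are not reproduced here), but they show how an ad can win on the aggregate view while losing on desktop:

```python
# Hypothetical (clicks, conversions) counts per ad and device.
# All numbers are made up for illustration.
data = {
    "Ad 1": {"mobile": (1000, 80), "desktop": (200, 6)},
    "Ad 2": {"mobile": (200, 10), "desktop": (1000, 50)},
}

def conv_rate(clicks, conversions):
    """Conversion rate as a fraction of clicks."""
    return conversions / clicks

# Aggregate view: Ad 1 looks like the clear overall winner.
for ad, devices in data.items():
    clicks = sum(c for c, _ in devices.values())
    convs = sum(v for _, v in devices.values())
    print(f"{ad} overall: {convs / clicks:.1%}")

# Segmented view: Ad 2 actually wins on desktop.
for device in ("mobile", "desktop"):
    for ad, devices in data.items():
        print(f"{ad} on {device}: {conv_rate(*devices[device]):.1%}")
```

With these numbers, Ad 1 wins overall and on mobile, while Ad 2 wins on desktop, which is exactly the pattern the aggregate view hides.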
Therefore, you’d now want to control which ad shows on which device, and this can be accomplished with device preferences.
When you create a text ad, you can specify the device preference:
If a campaign is set to show on all devices, and you have not set a preference by ad, your ads will be shown on all devices.
If a campaign is set to show on all devices, and all your ad preferences are set to mobile, your ads will still be shown on all devices. Device preference only takes effect when an ad group contains both mobile preferred and non-mobile ads.
To control the ad serving by device, you need both a mobile preferred ad and a non-mobile preferred ad in each ad group. To test ads by device, you need at least two mobile preferred ads and two non-mobile preferred ads in each ad group.
Image Ad Preferences
In “legacy” campaigns, most sophisticated accounts would segment their display advertising from their search ads, and their mobile display campaigns from their desktop display campaigns. Because these campaigns were already segmented by device, most marketers would just upload “mobile” ads to their mobile campaigns and desktop sizes to their desktop campaigns based upon Google’s sizes:
However, several of the sizes that are not traditionally considered mobile ad sizes can be shown on mobile devices:
Therefore, you will also want to specify the mobile preference of your image ads so that you can test them by device, just as you do your text ads.
An Easy Way To Determine Ad Types By Device
In the AdWords interface, it is not easy to see if you have a mobile and non-mobile preferred ad in each ad group. The easiest way to see this data is to use a pivot table and conditional formatting.
In this case, a simple pivot table was used to show the number of ads by device preference in each ad group; and then, conditional formatting was applied to highlight any cell that was less than 1.
If you wanted to make sure you were testing in each ad group, you could also highlight all cells with less than 2 ads using conditional formatting. This would allow you to see which ad groups need ads created so that you can test them.
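The same check can be done outside Excel. Here is a minimal sketch using a pandas pivot table; the data and column names ("Ad group", "Device preference", "Headline") are assumptions, so match them to whatever your actual ad export uses:

```python
import pandas as pd

# Hypothetical ad export rows; replace with your real export.
ads = pd.DataFrame({
    "Ad group": ["Shoes"] * 3 + ["Hats"] * 3 + ["Boots"] * 4,
    "Device preference": ["Mobile", "Mobile", "All"] + ["All"] * 3
                         + ["Mobile", "Mobile", "All", "All"],
    "Headline": list("ABCDEFGHIJ"),
})

# Count of ads per ad group, split by device preference.
pivot = ads.pivot_table(index="Ad group",
                        columns="Device preference",
                        values="Headline",
                        aggfunc="count",
                        fill_value=0)

# Flag any ad group lacking at least 2 ads of each preference,
# since 2 per device type are the minimum needed to run a test.
needs_ads = pivot[(pivot < 2).any(axis=1)]
print(needs_ads)
```

This mirrors the pivot table plus conditional formatting approach: the `needs_ads` frame lists exactly the ad groups where new ads must be created before testing can begin.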
Run Your Statistical Confidence Numbers As Normal
Once you have the ads set up and running by device, you can do your statistical confidence calculations and pick your winner — just make sure to segment the information by device.
Only use your mobile information to test your mobile ads and pick winners.
Only use your desktop information to test your desktop ads and pick winners.
Once the data is segmented by device, the way you run your numbers and pick winners will not change with enhanced campaigns.
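As one common way to run those confidence numbers, here is a sketch of a two-proportion z-test on conversion rate. The click and conversion figures are hypothetical; the key point is that only one device's data goes into each comparison:

```python
import math

def z_test_rates(clicks_a, conv_a, clicks_b, conv_b):
    """Two-proportion z-test; returns (z, two-sided p-value)."""
    p_a, p_b = conv_a / clicks_a, conv_b / clicks_b
    p_pool = (conv_a + conv_b) / (clicks_a + clicks_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / clicks_a + 1 / clicks_b))
    z = (p_a - p_b) / se
    # Normal CDF via math.erf for the two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical desktop-only numbers for two ads: the desktop winner
# is picked from desktop data alone, never from blended totals.
z, p = z_test_rates(1000, 50, 1000, 30)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If `p` falls below your significance threshold (0.05 is a common choice), you can declare a winner for that device; otherwise, keep the test running.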
A “Cheater’s” Way Of Testing
Creating thousands of new ads can be a daunting task — so, there is a shortcut you can use. However, please note that, as with any shortcut, there are some underlying weaknesses.
Instead of creating ads for every device type, if your landing pages have the same content (such as with responsive design) and if overall conversion actions by device are the same, then you can start with just ads on “all” devices. You can then segment the data by device type and run your statistical confidence by device.
Once you have a winning ad by device, you can change the ad’s device preference to mobile if it’s a mobile winner and leave the desktop winners set to all devices.
There are a few inherent weaknesses to this approach:
- You cannot customize the call to action by device
- When you “edit” your winning mobile ad, it must go back under review and the stats are “reset” for the ad
This isn’t an ideal long-term solution, but if you are trying to transition many campaigns and thousands of ads to mobile devices, it can be a way to start ad testing.
However, with a proper transition, you will keep your mobile ads in your enhanced campaign by moving them into your desktop campaigns (or vice versa) and using device preference to keep them segmented.
Enhanced campaigns are a major change to managing AdWords. However, they do not change the underlying principles of ad testing. You must test ads — and a good ad test will not only examine the differences in multiple ads, it will also take into account segmented data such as the device where the ad was displayed.
By ensuring you are controlling your ads displayed by device type, you can be confident in your ad tests and ensure that you are keeping the best ad for your account.
Even with device segmentation, the advice in many previous columns on ad testing still holds; it just requires a preliminary step: controlling ad serving by device. You can still easily manage and test millions of ads and use cross-ad-group testing principles.