PPC Mad Scientists Prove Google Right… And Wrong
At the SMX Advanced 2010 conference in Seattle, we invited four leading paid search experts to share some of their recent experiments in paid search in a new session entitled The Mad Scientists of Paid Search. The panelists were encouraged to add a bit of academic rigor to their presentations by including their hypotheses, design of the experiment, data, results and conclusions. The audience was invited to participate as peer reviewers, to challenge the assumptions, data integrity and conclusions of the scientists.
For those of you who were not able to make this year’s SMX Advanced, which sold out weeks in advance, I’d like to highlight the experiments presented by three of our PPC mad scientists, Wister Walcott, Addie Conner and Lulu Gephardt. Our fourth panelist, Dr. Siddarth Shah of Efficient Frontier, did not present an experiment, but instead led us through an engaging, professorial dissertation on portfolio management.
Google proven right
One of the great ongoing debates within the PPC community is whether conversion rate varies by ad position. In most paid search auctions, the higher your position (further “north”), the more clicks you will get. But what about the conversion rate?
Marketing lore holds that visitors with purchase intent will read all the ads closely, while “rubberneckers” will be more likely to click only the ads at the top. If true, this would mean conversion rates improve as ads move down the page, and the thoughtful marketer could leverage that pattern when maximizing profit.
Wister Walcott, co-founder and VP of Products at Marin Software, working with his team of in-house PhDs, designed an experiment that looked at the relationship between ad position and conversion rate. In an experiment that incorporated several hundred thousand data samples, Walcott isolated high-volume keywords that changed position from one day to the next, adjusted for day-of-week differences, and evaluated the change in conversion rates. As a control, they first confirmed that conversion rates were stable when position stayed the same (they were), and then looked for a change in conversion rate when ad position changed.
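The core statistical question in a test like this, comparing a keyword's conversion rate on days at different positions, can be sketched with a standard two-proportion z-test. This is a hedged illustration, not Marin's actual model; the function name and the click/conversion counts below are hypothetical.

```python
import math

def conversion_rate_z_test(conv_a, clicks_a, conv_b, clicks_b):
    """Two-proportion z-test: did the conversion rate differ between
    two samples (e.g. matched weekdays at different ad positions)?"""
    p_a = conv_a / clicks_a
    p_b = conv_b / clicks_b
    # Pooled rate under the null hypothesis that position has no effect.
    p_pool = (conv_a + conv_b) / (clicks_a + clicks_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / clicks_a + 1 / clicks_b))
    return (p_a - p_b) / se

# Hypothetical counts: one keyword at position 2 vs. position 5.
z = conversion_rate_z_test(120, 4000, 85, 3000)
significant = abs(z) > 1.96  # 95% confidence threshold
```

With counts like these, the z-score falls well inside the 95% band, i.e. no significant difference, which is the pattern Marin reported in eleven of twelve data sets.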
Their findings? There was no significant correlation between ad position and conversion rate in eleven of twelve data sets tested. In just one data set, a lead generation site, the PPC scientists at Marin observed conversion rate declining when position went south. Perhaps more cautious clickers were less likely to fill out a lead form?
The Marin results corroborated findings published last year by Google Chief Economist Hal Varian and his team (see his Inside AdWords post for more details). Wister cautioned that their study covered high-volume terms only, and since they found one exception, there are most likely others. If you’d like to conduct a similar test for yourself, Wister will send you the model they used in their study—ping him at firstname.lastname@example.org.
Google proven wrong
Addie Conner, of Avenue 100 Media Solutions, developed some experiments to help understand the quality score implications of consolidating and reorganizing accounts. The experiment flowed out of her need to reorganize over 100 different AdWords accounts for a single URL into a new account or a new set of accounts.
Addie wanted to know whether it is always true, as Google suggests, that AdWords maintains the history of domains, keywords and ads. If this were true, Addie hypothesized, then there should be little or no impact on Quality Score, clicks, impressions and CTR when reorganizing within the same account, or from multiple accounts into a different existing account—or even into an entirely new account.
Addie’s experiments were designed to answer these questions related to multiple account reorganization:
- What happens when you move keywords and ads within the same account?
- What happens when you move keywords and ads to a different account (has history)?
- What happens when you move keywords and ads to a new account (no history)?
In the first experiment, her research showed virtually no difference in Quality Score when moving keywords and ads within the same account. Her other two experiments, however, demonstrated that Quality Score suffered significantly when ads and keywords were moved to a new or existing account.
Addie’s conclusion was that while it may be true that Google maintains Quality Score history when moving from one single account to another, it is definitely not true when consolidating and reorganizing multiple accounts into new or existing accounts.
Google display network experiment
Lulu Gephardt, Paid Search Manager for REI, designed a very practical experiment to see if it was possible for REI to wring more return on ad spend from the Google display network (formerly known as the content network) without cannibalizing their well-established affiliate network. Her test was simple and worth repeating for other retailers.
Instead of avoiding affiliate sites with their ads, REI directly targeted their affiliate partner sites which were running AdSense with CPC-based text and banner ads. The results were pleasantly surprising. Instead of cannibalizing their own network, REI increased the effectiveness of both their own display advertising and that of their affiliates. Lulu’s presentation, The One-Two Punch, was as entertaining as it was informative, and worth reviewing.
Calling all PPC mad scientists
I suppose that to some degree, all of us in paid search are mad scientists. While many of our experiments are practically-oriented and primarily designed to improve campaign performance directly, sometimes we create tests because we are insatiably curious to find out all that we can about how paid search really works.
If you have some interesting experiments you’ve tried that you’d like to share with the world, let me know, and I’d be glad to share the limelight with you and your mad science in one of my upcoming columns. Maybe you’d even like to make a speaking pitch to present your research at one of the upcoming conferences, such as SMX East in October. If so, I would love to hear from you—drop me an email or leave a comment below.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.