Google AdWords Geo-Targeting: Have We All Been Doing It Wrong?

It’s rare that a true challenge to a best practice (and a seemingly straightforward one at that) arises.

But when Marta Turek, Group Manager, Performance Media at Mediative, presented case studies at SMX Advanced this year showing what happened when they tested an opt-out versus the traditional opt-in approach to geo-targeting in AdWords, jaws dropped. Eyes bulged. (Editor’s Note: Turek’s complete presentation is now embedded at the end of this piece; the author highlights key points throughout the article.)

I recently caught up with her to talk more about their findings. As often happens with light bulb moments, Mediative didn’t set out to change the way they geo-target. The agency was running a state-wide campaign in Colorado for a large client. When the client wanted to run a campaign targeting Denver users only, Mediative took Denver out of the state campaign and set up a campaign to target just the city of Denver in AdWords.

What Happened Next Stunned Them

The team realized that after the change to a city-targeted campaign, impressions and clicks in Denver actually fell — and fell hard. Over the eight-week test period of targeting Denver, impressions fell 21 percent. Clicks dropped by 57 percent.

Here’s Turek’s graph showing what happened to clicks during the eight-week test period in which they ran the Denver-only campaign, and what happened immediately after the team added Denver back into the state-wide campaign.


The impression and click drop-offs were startling, but the rise in cost-per-click (CPC) really surprised them. The client’s CPCs in Denver rose during the test period and then fell back by $3.39 (30 percent) almost immediately after Mediative added Denver back into the state-wide campaign. “We were just shocked at how much CPCs went up when we did that,” says Turek.

Turek is quick to point out that this test wasn’t perfect and acknowledges the challenges inherent in sequential testing, including behavioral changes due to seasonality. However, she says the targeting was the one major factor that changed — there were no major ad copy or landing page changes, and they didn’t see big swings in quality scores.

What’s Going On Here?

The opt-in approach triggered a significant loss in volume and budget. “IP coverage is a huge issue and everyone knows and agrees with that factor,” Turek told me.

“There is also the fact of increasing your competition for a finite audience. We weren’t sure if there was an algorithmic system that says if you are narrowing your geographic targeting or adding other specifics about demos, you’re becoming more granular, essentially signaling to Google you are willing to pay more. Google says that’s not it at all,” said Turek.

Turek says two Google employees chatted with her after her talk, and one even told her he would recommend the opt-out approach to his clients.

A Second Test, Same Results

Mediative then tested the new approach with the same client in North Carolina, where they wanted to target just two DMAs. Turek says that after seeing what happened in the Denver campaign, “we realized they were going to lose budget, clicks and impressions. We wanted to protect budget and volume.” So they tested targeting the two DMAs directly versus running a state campaign and opting out of the non-target DMAs.

They ran the test for eight weeks. “We started seeing the efficiencies again with the opt-out approach,” said Turek.

North Carolina Geo-targeting testing

They were able to maintain traffic volume in the target DMAs in the opt-out test campaign. CPCs fell 12 percent. Better yet, conversions rose by 20 percent, and conversion rate rose 8 percent.

“Absolute conversions in target areas increased, and so we see increased account efficiency,” said Turek.

In the graph charting conversion rate changes below, you can see it rise in the two target DMAs (Charlotte in orange and Raleigh-Durham in purple) immediately after the opt-out campaign was implemented at week 44.

geo-targeting conversion rate

Moving Forward With A New Approach & More Testing

Mediative has adopted the opt-out approach as a standard practice and rolled out the strategy across all states the client is targeting. Turek says they had segmented this client’s account at the state level and then targeted DMAs. Now they segment the account by DMA and exclude at the zip code and city level.

“Next we’ll be testing with enhanced campaigns, so we will probably have new findings in a few months,” says Turek.

Marty Weintraub, Founder and Evangelist of aimClear, was a fellow panelist with Turek at the SMX Advanced session. He mentioned he would be testing this approach, so I followed up with him to get his thoughts. Weintraub wrote,

“We’ve seen case studies that replicated her results in small to mid-size crucibles. The smaller the data, the less predictable the results. The bigger the data, the bigger (seemingly) the effect.

We’re about to test Marta’s theories on one of the largest AdWords accounts in America, so the data’s not in. At scale, the implications are massive. This was one of the freshest ideas we’ve seen in recent memory, fitting for SMX Advanced. I could see numerous jaws drop to the floor from my position on the podium, even over the glare of SMX lights. Thanks to Marta for upholding our industry’s reputation for being magnanimous with such data.

Now it’s the same race we’ve seen so many times before…until Google takes note of the seam in the system and shuts it off. Cash in while you can, people.”

Time will tell if Google takes action or addresses this new approach in some way. (The fact that one of the Google representatives told Turek he’d recommend the approach to his clients is certainly interesting.)

In the meantime, enhanced campaigns are here either to add another wrinkle or iron them out, depending on your view. It will be fascinating to see more testing in this area.

It’s not every day a best practice gets shaken to its core.

Complete SMX Advanced Presentation By Marta Turek



About The Author: Ginny Marvin writes about paid online marketing topics including paid search, paid social, display and retargeting. Beyond Search Engine Land, she provides search marketing and demand generation advice for ecommerce companies. She can be found on Twitter as @ginnymarvin.

  • Kye

    Please clarify: What would be the best practice in the Denver example? In other words, how would an opt-out strategy work? Seems like you specifically target Denver so that you can create ads that leverage Denver-specific language or Denver-specific landing pages.

  • FangDigitalMarketing

    Interesting data, however, we’ve seen just the opposite results when doing the same thing (and in some cases, the same markets).

    In many cases, we will silo out major DMAs of a national campaign so that we can discover more localized terms, test localized creative, and a variety of other factors. In all cases, we see the CPCs change, but that is because we are buying into a specific market, not because of the targeting (a comparison of the CPCs within that geo, found under the Dimensions tab, proves as much). However, what we usually see is a massive increase in conversions, despite the fact that the national campaign is showing no signs of maxing out its budget, etc.

    Meanwhile, as we remove piece after piece of the national campaign, what is left there is the “dregs” of the rest of the country (which can be different for different products).

    While this article (and the presentation that prompted it) does present some interesting theories, I have a feeling that the difference in the CPCs and clicks has more to do with the fact that you are basically creating a new campaign in a new market and, thus, a new bidding marketplace, and less to do with certain areas being looked upon as a “premium.”

  • Ben

    I also was confused when I first read the article, but I think they are making the point of opt-out is better than opt-in. So targeting ONLY the Denver DMA (and I assume metro area) produced worse results than targeting Colorado and removing certain DMAs (as seen in the right-side NC map) even when using Denver-specific language.

    I wish they would have provided the same type of graphs for both tests rather than switching from Click % and CPCs for Denver to CVR for Raleigh (I assume it was an error showing the same CVR graph twice for Raleigh).

  • JM

    Amen to the comment about keeping the graphs consistent. It’s almost as if the person who wrote this purposefully wanted the data to be confusing.

  • Ginny Marvin

    JM and Ben — Duly noted on keeping the graphs consistent – and I’ve removed that repeated graph that somehow snuck in there, thx for the catch.

    The reason I chose to show the conversion rate graph is because the seasonality during the North Carolina (Thanksgiving into holidays) testing makes the impression and click data appear quite spikey. The conversion rate data is a clearer visual. But the point I tried to get across is that Mediative’s data showed they were able to maintain traffic volume (clicks and impressions) levels in the opt-out campaign and also saw absolute conversions and conversion rate increase compared to the campaign in which they specifically targeted the two DMAs.

    Marta included many more graphs in her presentation, which I would encourage you to check out. (I’ll get a link out.)

    Bottom line is this is meant to be a conversation starter on how we geo-target. Again, Marta was very quick to point out that the tests weren’t perfect, but they were compelled enough by what they saw in these two experiences that they are using an opt-out strategy with this client and aim to do more testing.

    There are others, like Marty at aimClear, who found Marta’s presentation persuasive enough to test it themselves. They’ve seen similar results, though at a small scale. So it will be interesting to see this played out on a larger campaign. It’s also good to hear feedback like that of Fang Digital Marketing below.

  • JM

    True, makes sense. Thanks for all the insights. Please post the link when you can. This is definitely an awesome topic to bring to light. Awesome job on picking this up and bringing attention too.

  • Pat Grady

    Opt-outs (exclude) are like Exact matching, Opt-in (add (include)) are like Mod Broads. When you kill overlap via unavoidable IP looseness, tighter snare.

  • Marta Turek

    Interesting feedback – thanks for sharing!

    I think the slight difference though, is that what is described above is not to silo out major DMAs, but rather to target those same DMAs more efficiently.

    When we buy into a specific market (opt-in), absolutely, we see the same changes in CPCs that you describe – they go up!

    However, when we target that same market explicitly (opt-out) we then see a drop in CPCs exactly because we are not specifically selecting that market as an (opt-in). We are removing everything we don’t want to target, but still from a settings perspective, targeting the greater geography (i.e.: State Level)

    It will be fascinating to hear what everyone experiences, because as you mention, different markets, products, lead gen vs. e-commerce, local vs. national buying —- all of this and a plethora of other factors will impact the results.

    One point I would argue is to do the test without creating a new campaign (which is what we have tested) — so you are not changing anything except the geo-targeting settings — then see if that change alone has an impact on performance.

  • Marta Turek

    Kye – based on the findings, let’s assume the campaign is targeted at a State Level and you want to target Denver without losing volume. Rather than explicitly targeting Denver (by selecting the Denver City or Denver DMA in AdWords) you would instead exclude all the other DMAs outside of Denver while still maintaining State Level targeting.

    The idea is that rather than limiting your geo-targeting settings – you are excluding what you don’t want from the bigger pie, instead of cutting down the size of the pie.
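To make the mechanics of the two approaches concrete, here is a minimal Python sketch of the difference Turek describes. The DMA names are illustrative stand-ins, not a complete list of Colorado DMAs:

```python
# Minimal sketch of opt-in vs. opt-out geo-targeting settings.
# The DMA names are illustrative, not a complete set of Colorado DMAs.

COLORADO_DMAS = {"Denver", "Colorado Springs-Pueblo", "Grand Junction-Montrose"}

def opt_in_settings(target_dma):
    """Opt-in: shrink the targeted geography down to just the DMA you want."""
    return {"target": {target_dma}, "exclude": set()}

def opt_out_settings(state, state_dmas, target_dma):
    """Opt-out: keep the broad state target and exclude everything else."""
    return {"target": {state}, "exclude": set(state_dmas) - {target_dma}}

# Keep the "bigger pie" (Colorado) targeted; carve out the non-Denver DMAs.
settings = opt_out_settings("Colorado", COLORADO_DMAS, "Denver")
```

Either way, only Denver traffic is wanted; the difference is whether the campaign settings say "target Denver" or "target Colorado, exclude everything that isn't Denver."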

  • Kye

    Makes perfect sense. Thanks for the clarification!

  • FangDigitalMarketing

    Actually, what I described is that CPCs go down… or up… or stay the same… it really has nothing to do with the siloing of the DMA, but more about the creative execution, additional keywords, etc.

    We haven’t tried your opt-out method because… well, why would we really? It’s not really the intent of the settings. However, technically, it should produce the same results, and if it doesn’t, that could actually be an issue; however, to say that one is better than another probably isn’t accurate, because the thesis that regular opt-in targeting causes CPCs, etc. to go up is inaccurate, at least based on the cases that we can present here.

    In reality, there would need to be a LOT more testing done before we could generate a statistically significant amount of data to prove this point or even to prove that there is in fact a bug in the system.

    Hopefully that clears it up.

  • Keith Franco

    Logical. Wish I had thought of it.

  • Dragan Pozder

    How were advanced location settings set up in those campaigns? It could have influenced the results.

  • Marta Turek

    Ah, thanks for clarifying – gotcha.

    So, the avg. CPCs going down was a by-product of the original goal. Absolutely concur that across different industries, products, campaigns, we may not see the same change in avg. CPCs.

    The original problem we were solving for was losing impressions if we went for a direct opt-in strategy at a DMA level. We knew we would lose impressions and clicks in the same DMAs we were already targeting at state level, if we simply explicitly targeted those DMAs.

    We have seen many gains from this approach – maintaining impression & click volume would be the simplest reason to test out this form of targeting, especially if the target area is smaller, thus making the volume opportunity more finite.

    I, in turn, hope that this clarifies what I was trying to say! :-)

  • Marshall

    It would be interesting to see if this scales to the country as a whole – i.e. whether targeting the entire U.S. and then eliminating the states you don’t want would have a similar effect.

  • Marta Turek

    Dragan – the location settings were static between control & experiment.

    Target: People in, searching for, or viewing pages about my targeted location

    Exclude: People in my excluded location

  • Ginny Marvin

    JM – Glad that helped. We have just embedded Marta’s full presentation from SMX Advanced at the end of the post. Let us know if any questions come up!

  • Sam

    Could you please clarify how you were able to exclude such large areas in the state, as seen in your DMA Region Target by Opt Out image? I’m currently working on a campaign out of California, and opting out of everything except for a 70-mile radius around Burbank seems as if it would take hours to do.

  • FangDigitalMarketing

    Again, not saying you’re wrong… just that there needs to be a LOT more data before I see this as a “thing.”

    For instance, we actually see more impressions when we target regionally, plus, the extra bonus that, even though the old national campaign wasn’t complaining of a lack of budget, we were somehow getting additional impressions and spending more in budget once we broke it out regionally.

    Again, there is a chance that you’ve found a real bug in the form of there being a difference between the opt-in vs. opt-out methods, but even that seems to be something that would benefit from greater research.

    My big question is, when you “test” these two methods, is it with the same set of keywords over the same period of time? Or was it more that you saw that increase in Denver and decided to try it a different way and saw different results later?

  • Dragan Pozder

    Marta, thanks for the answer.

    If the advanced location settings were the same, then the campaigns didn’t target exactly the same amount (and location) of people; the opt-out campaign was more narrow – it excludes more than what was “excluded” by opt-in with the broader advanced location setting for “target.”
    I’m not sure if that fact could have influenced results to that extent; just trying to find some logic behind your results.
    I will test the theory but with both campaigns set to:

    target: people in my location only
    exclude: people in my excluded location
    Do you think that would make test results more reliable?

  • Marta Turek

    To answer your questions:
    1. The tests were sequential – so not over the same period of time
    2. Yes, the keywords were the same
    3. We were solving for the problem of impression loss – because of what we saw happen in Denver – so we tried the opt-out strategy later, in a different way, as you put it, to try to mitigate for impression loss that we knew would occur with the same type of opt-in strategy that had been done for Denver.

    I agree that further testing is required by the industry to make this a ‘thing’, as you say. However, we think it is enough of a ‘thing’ to put it out there and get more testing done!

    It challenges the status quo, which is what I think makes it so exciting. Thanks for the conversation!

  • Marta Turek

    Hey Dragan, we kept the settings the same for the purpose of consistency. The settings were later changed to the ones you mention.

    I absolutely agree with your recommended location settings, which are exactly the settings that have been set up across all the campaigns that are using this opt-out approach.

    Good luck testing!

  • Marta Turek

    Test it! :-)

  • Marta Turek

    Sam – the opt-out was easy in the image above, as we simply excluded Nielsen-defined DMAs – so all we had to do was determine the DMAs in the state and then exclude them. You can exclude at a DMA level in AdWords – so that’s easy enough.

    Given you have some radius targeting, my approach would be to:
    a. Exclude the DMAs that you can (which are clearly removed from your target region)
    b. Then, depending on the DMA your target region falls into, exclude any other major cities in that DMA

    OR target the DMA in which your target location falls and then exclude the major cities / metros that you don’t want to capture

    You can then run a geo report periodically to see if there are clicks coming in from non-target areas, and you can add those into your exclusion list.

    As Dragan mentions above, for this strategy, you would want to run with these advanced location settings:
    target: people in my location only
    exclude: people in my excluded location
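The periodic geo-report check described above can be sketched as a short loop. The report rows and location names here are made-up examples, not actual AdWords geo-report output:

```python
# Sketch of the periodic geo-report check: find locations outside the target
# area that are still receiving clicks, and add them to the exclusion list.
# The report rows below are hypothetical examples.

def update_exclusions(geo_report, target_locations, exclusions):
    """Return the exclusion set, grown by any non-target location with clicks."""
    new_exclusions = set(exclusions)
    for row in geo_report:
        if row["clicks"] > 0 and row["location"] not in target_locations:
            new_exclusions.add(row["location"])
    return new_exclusions

report = [
    {"location": "Burbank", "clicks": 42},    # target area: leave alone
    {"location": "Fresno", "clicks": 3},      # non-target, leaking clicks
    {"location": "Sacramento", "clicks": 0},  # non-target, but no clicks yet
]
exclusions = update_exclusions(report, {"Burbank"}, set())
```

Only Fresno gets added to the exclusions: Burbank is in the target area, and Sacramento has not received any clicks yet.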

  • Marshall

    Not currently working on any search clients, otherwise I would.

  • FangDigitalMarketing

    Thanks for the clarification.

    I’m not sure if it “challenges the status quo,” since it’s one instance (or maybe a few now) versus anywhere from 1.2-1.6 million active AdWords clients at any given moment, then a subset of those that are using regional targeting.

    I think if you guys had a statistically significant amount of clients (or even campaigns) that had experienced this, then maybe there would be a case that you’ve actually discovered some sort of targeting anomaly.

    At this point, to get a statistically significant number with a 95% confidence level, you’d need at least 384 pairs of tests running concurrently (+/-5%) (at the same time, with the same set of keywords, bids, etc.) where only the geographic targeting was changed – which isn’t that much, so you may actually have a shot.

    Even then, there’s a really good chance that you’re still not technically talking about the same data sets because of things like IP targets, etc. used to determine geography.

    That said, it’s an interesting theory; but I think it’s a little presumptuous to share something so damning at a conference or in an industry trade publication with so very little backup.

    Sorry, I’m really trying not to be the a-hole here, but things like this can be really dangerous (I see too many of them on the organic side as it is).

    Hopefully Google took a look at this article and is running the numbers against its entire dataset to determine if there’s anything to the theory.

    Good luck.
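For what it’s worth, the “at least 384 pairs” figure in the comment above matches the standard sample-size formula for a 95% confidence level and a ±5% margin of error; a quick check:

```python
import math

# Standard sample-size formula: n = z^2 * p * (1 - p) / e^2
# z = 1.96 for 95% confidence, p = 0.5 (worst-case variance),
# e = 0.05 for a +/-5% margin of error.
z, p, e = 1.96, 0.5, 0.05
n = (z ** 2) * p * (1 - p) / (e ** 2)

print(round(n, 2))   # 384.16
print(math.ceil(n))  # 385 when rounded up; commonly quoted as "384"
```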

  • Melissa Mackey

    We are running a side-by-side test of this in one of our campaigns and are seeing the same results as Marta: higher CPC and cost/conversion for opt-in vs. opt-out. This isn’t a large scale test, but definitely warrants further testing.

  • Daniel Vardi

    How should we now deal with this when it comes to enhanced campaigns? Meaning, if we want to increase a bid percentage for a city within a state, or a state within the US, what happens then?

  • Scott Clark

    Implemented on a local campaign somewhat similar to the case study in the article. CTR +12.5%, CPC -18%. Me happy.

  • Marta Turek

    Daniel – Enhanced Campaigns add a level of complexity to everything, for sure. Bid modifiers are a whole other conversation. There were some interesting presentations at SMX Advanced that cautioned against the extensive use of bid modifiers.

  • Ginny Marvin

    Sorry for the confusion, and thanks for the recommendation to use “include” vs “exclude” in the future. Appreciate the feedback.

  • Gavin Hudson

    We’ve been seeing similar results with Twitter geotargeting. Worldwide campaigns out-perform geotargeted campaigns for reach, conversions and CPC. I think it ultimately depends what conversion you have in mind. If it’s an action that anybody can take online then geotargeting will lower conversions and increase CPC.

  • Marry

    Geo targeting really cool

  • Brian Belgard

    Couldn’t agree with you more on this one. I didn’t see anything in this study that I took issue with, but I would want to see some more intensive controlled testing before I’d be willing to declare victory here. Certainly makes for a good testing hypothesis though.

  • Bryant Garvin

    The one caveat I will add to inclusion versus exclusion targeting is to be careful if you use exclusion targeting because you only provide a product or service in specific areas, especially if you maintain the standard option for Google to include “people in, searching for, or viewing pages about my targeted location.”

    We have found clients which have done this have ended up with a decent percentage of leads from those areas they were hoping to exclude. So just be careful as to why you are trying out the exclusion vs inclusion method of geo-targeting.

  • bobrichards

    nothing like learning with the client’s money
