• Kye

    Please clarify: What would be the best practice in the Denver example? In other words, how would an opt-out strategy work? Seems like you specifically target Denver so that you can create ads that leverage Denver-specific language or Denver-specific landing pages.

  • FangDigitalMarketing

    Interesting data; however, we’ve seen just the opposite results when doing the same thing (and in some cases, in the same markets).

    In many cases, we will silo out major DMAs of a national campaign so that we can discover more localized terms, test localized creative, and a variety of other factors. In all cases, we see the CPCs change, but that is because we are buying into a specific market, not because of the targeting (a comparison of the CPCs within that geo, found under the Dimensions tab, proves as much). However, what we usually see is a massive increase in conversions, despite the fact that the national campaign is showing no signs of maxing out its budget, etc.

    Meanwhile, as we remove piece after piece of the national campaign, what is left there is the “dregs” of the rest of the country (which can be different for different products).

    While this article (and the presentation that prompted it) does present some interesting theories, I have a feeling that the difference in the CPCs and clicks has more to do with the fact that you are basically creating a new campaign in a new market and, thus, a new bidding marketplace, and less to do with certain areas being looked upon as a “premium.”

  • Ben

    I also was confused when I first read the article, but I think they are making the point that opt-out is better than opt-in. So targeting ONLY the Denver DMA (and, I assume, the metro area) produced worse results than targeting Colorado and removing certain DMAs (as seen in the right-side NC map), even when using Denver-specific language.

    I wish they had provided the same type of graphs for both tests rather than switching from Click % and CPCs for Denver to CVR for Raleigh (I assume showing the same CVR graph twice for Raleigh was an error).

  • JM

    Amen to the comment about keeping the graphs consistent. It’s almost as if the person who wrote this purposefully wanted the data to be confusing.

  • Ginny Marvin

    JM and Ben — Duly noted on keeping the graphs consistent – and I’ve removed that repeated graph that somehow snuck in there, thx for the catch.

    The reason I chose to show the conversion rate graph is that the seasonality during the North Carolina testing (Thanksgiving into the holidays) makes the impression and click data appear quite spiky. The conversion rate data is a clearer visual. But the point I tried to get across is that Mediative’s data showed they were able to maintain traffic volume (clicks and impressions) in the opt-out campaign and also saw absolute conversions and conversion rate increase compared to the campaign in which they specifically targeted the two DMAs.

    Marta included many more graphs in her presentation, which I would encourage you to check out. (I’ll get a link out.)

    Bottom line: this is meant to be a conversation starter on how we geo-target. Again, Marta was very quick to point out that the tests weren’t perfect, but they found what they saw in these two experiments compelling enough that they are using an opt-out strategy with this client and aim to do more testing.

    There are others, like Marty at aimClear, who found Marta’s presentation persuasive enough to test it themselves. They’ve seen similar results, though at a small scale, so it will be interesting to see this play out on a larger campaign. It’s also good to hear feedback like that of Fang Digital Marketing below.

  • JM

    True, makes sense. Thanks for all the insights. Please post the link when you can. This is definitely an awesome topic to bring to light, and awesome job on picking it up and bringing attention to it.

  • Pat Grady

    Opt-outs (exclude) are like exact matching; opt-ins (include) are like modified broads. When you kill the overlap that comes from unavoidable IP looseness, you get a tighter snare.

  • Marta Turek

    Interesting feedback – thanks for sharing!

    I think the slight difference, though, is that what is described above is not siloing out major DMAs, but rather targeting those same DMAs more efficiently.

    When we buy into a specific market (opt-in), absolutely, we see the same changes in CPCs that you describe – they go up!

    However, when we target that same market via opt-out, we then see a drop in CPCs, exactly because we are not specifically selecting that market as an opt-in. We are removing everything we don’t want to target but, from a settings perspective, are still targeting the greater geography (i.e., the state level).

    It will be fascinating to hear what everyone experiences because, as you mention, different markets, products, lead gen vs. e-commerce, local vs. national buying: all of this and a plethora of other factors will impact the results.

    One point I would argue for is doing the test without creating a new campaign (which is what we have tested), so you are not changing anything except the geo-targeting settings; then see if that change alone has an impact on performance.

  • Marta Turek

    Kye – based on the findings, let’s assume the campaign is targeted at the state level and you want to target Denver without losing volume. Rather than explicitly targeting Denver (by selecting the Denver city or Denver DMA in AdWords), you would instead exclude all the other DMAs outside of Denver while still maintaining state-level targeting.

    The idea is that rather than limiting your geo-targeting settings, you are excluding what you don’t want from the bigger pie, instead of cutting down the size of the pie.
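
    For readers who manage campaigns programmatically, here is a minimal sketch of that opt-out pattern using the current Google Ads API Python client (the article predates this API, so this is an illustration rather than Mediative’s actual setup; all account, campaign, and geo criterion IDs below are hypothetical placeholders you would look up in Google’s geo target reference data):

        from google.ads.googleads.client import GoogleAdsClient

        # All IDs are hypothetical placeholders; real geo target constant
        # IDs for states and DMAs come from Google's geotargets reference data.
        CUSTOMER_ID = "1234567890"
        CAMPAIGN_ID = "987654321"
        STATE_ID = "21138"                          # placeholder: the state, e.g. Colorado
        NON_TARGET_DMA_IDS = ["200751", "200752"]   # placeholder: DMAs outside Denver

        client = GoogleAdsClient.load_from_storage("google-ads.yaml")
        criterion_service = client.get_service("CampaignCriterionService")
        campaign_path = client.get_service("CampaignService").campaign_path(
            CUSTOMER_ID, CAMPAIGN_ID
        )

        operations = []

        # Opt-out pattern: keep the positive target broad (the whole state)...
        state_op = client.get_type("CampaignCriterionOperation")
        state = state_op.create
        state.campaign = campaign_path
        state.location.geo_target_constant = f"geoTargetConstants/{STATE_ID}"
        operations.append(state_op)

        # ...and exclude every DMA you don't want, rather than opting in
        # to the one DMA you do want.
        for dma_id in NON_TARGET_DMA_IDS:
            dma_op = client.get_type("CampaignCriterionOperation")
            dma = dma_op.create
            dma.campaign = campaign_path
            dma.negative = True  # this criterion is an exclusion
            dma.location.geo_target_constant = f"geoTargetConstants/{dma_id}"
            operations.append(dma_op)

        criterion_service.mutate_campaign_criteria(
            customer_id=CUSTOMER_ID, operations=operations
        )

    The key point is the single broad positive criterion (the state); the narrowing is expressed entirely as exclusions.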

  • Kye

    Makes perfect sense. Thanks for the clarification!

  • FangDigitalMarketing

    Actually, what I described is that CPCs go down… or up… or stay the same… it really has nothing to do with siloing out the DMA and more to do with the creative execution, additional keywords, etc.

    We haven’t tried your opt-out method because… well, why would we really? It’s not really the intent of the settings. Technically, though, the two methods should produce the same results, and if they don’t, that would actually be an issue. Still, to say that one is better than the other probably isn’t accurate, because the thesis that regular targeting via opt-in causes CPCs, etc. to go up is inaccurate, at least based on the cases that we can present here.

    In reality, there would need to be a LOT more testing done before we could generate a statistically significant amount of data to prove this point or even to prove that there is in fact a bug in the system.

    Hopefully that clears it up.

  • Keith Franco

    Logical. Wish I had thought of it.

  • Dragan Pozder

    How were advanced location settings set up in those campaigns? It could have influenced the results.

  • Marta Turek

    Ah, thanks for clarifying – gotcha.

    So, the avg. CPCs going down was a by-product of the original goal. I absolutely concur that across different industries, products, and campaigns, we may not see the same change in avg. CPCs.

    The original problem we were solving for was losing impressions if we went for a direct opt-in strategy at a DMA level. We knew we would lose impressions and clicks in the same DMAs we were already targeting at the state level if we simply explicitly targeted those DMAs.

    We have seen many gains from this approach – maintaining impression & click volume would be the simplest reason to test out this form of targeting, especially if the target area is smaller, thus making the volume opportunity more finite.

    I, in turn, hope that this clarifies what I was trying to say! :-)

  • Marshall

    It would be interesting to see if this scales to the country as a whole – i.e., would targeting the entire U.S. and then eliminating the states you don’t want have a similar effect?

  • Marta Turek

    Dragan – the location settings were static between control & experiment.

    Target: People in, searching for, or viewing pages about my targeted location

    Exclude: People in my excluded location

  • Ginny Marvin

    JM – Glad that helped. We have just embedded Marta’s full presentation from SMX Advanced at the end of the post. Let us know if any questions come up!

  • Sam

    Could you please clarify how you were able to exclude such large areas in the state, as seen in your DMA Region Target by Opt Out image? I’m currently working on a campaign out of California, and opting out of everything except a 70-mile radius around Burbank seems as if it would take hours.

  • FangDigitalMarketing

    Again, not saying you’re wrong… just that there needs to be a LOT more data before I see this as a “thing.”

    For instance, we actually see more impressions when we target regionally. Plus, the extra bonus: even though the old national campaign wasn’t complaining of a lack of budget, we were somehow getting additional impressions and spending more of the budget once we broke it out regionally.

    Again, there is a chance that you’ve found a real bug in the form of there being a difference between the opt-in and opt-out methods, but even that seems to be something that would benefit from greater research.

    My big question is, when you “test” these two methods, is it with the same set of keywords over the same period of time? Or was it more that you saw that increase in Denver and decided to try it a different way and saw different results later?

  • Dragan Pozder

    Marta, thanks for the answer.

    If the advanced location settings were the same, then the campaigns didn’t target exactly the same number (and location) of people; the opt-out campaign was narrower, since it excluded more than what was “excluded” by the opt-in campaign with its broader advanced location setting for “target”.
    I’m not sure whether that fact could have influenced the results to that extent; I’m just trying to find some logic behind your results.
    I will test the theory, but with both campaigns set to:

    target: people in my location only
    exclude: people in my excluded location

    Do you think that would make the test results more reliable?

  • Marta Turek

    To answer your questions:
    1. The tests were sequential – so not over the same period of time.
    2. Yes, the keywords were the same.
    3. We were solving for the problem of impression loss – because of what we saw happen in Denver – so we tried the opt-out strategy later, in a different way, as you put it, to mitigate the impression loss we knew would occur with the same type of opt-in strategy that had been done for Denver.

    I agree that further testing is required by the industry to make this a ‘thing’, as you say. However, we think it is enough of a ‘thing’ to put it out there and get more testing done!

    It challenges the status quo, which is what I think makes it so exciting. Thanks for the conversation!

  • Marta Turek

    Hey Dragan, we kept the settings the same for the purpose of consistency. The settings were later changed to the ones you mention.

    I absolutely agree with your recommended location settings, which are exactly the settings that have been set up across all the campaigns that are using this opt-out approach.

    Good luck testing!

  • Marta Turek

    Test it! :-)

  • Marta Turek

    Sam – the opt-out was easy in the image above, as we simply excluded Nielsen-defined DMAs – so all we had to do was determine the DMAs in the state and then exclude them. You can exclude at a DMA level in AdWords – so that’s easy enough.

    Given you have some radius targeting, my approach would be to:
    a. Exclude the DMAs that you can (those that are clearly removed from your target region)
    b. Then, depending on the DMA your target region falls into, exclude any other major cities in that DMA

    OR target the DMA in which your target location falls and then exclude the major cities/metros that you don’t want to capture.

    You can then run a geo report periodically to see if clicks are coming in from non-target areas, and add those to your exclusion list (a sketch of such a check follows at the end of this comment).

    As Dragan mentions above, for this strategy you would want to run with these advanced location settings:
    target: people in my location only
    exclude: people in my excluded location
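
    As a rough illustration of that periodic geo check (not Mediative’s actual workflow), here is a minimal sketch against the current Google Ads API, which exposes the same geographic data the old AdWords Dimensions tab did; the customer ID is a hypothetical placeholder:

        from google.ads.googleads.client import GoogleAdsClient

        CUSTOMER_ID = "1234567890"  # hypothetical placeholder

        client = GoogleAdsClient.load_from_storage("google-ads.yaml")
        ga_service = client.get_service("GoogleAdsService")

        # Clicks and impressions by metro (DMA) over the last 30 days; any
        # metro outside the target region is a candidate for the exclusion list.
        query = """
            SELECT
              segments.geo_target_metro,
              metrics.clicks,
              metrics.impressions
            FROM geographic_view
            WHERE segments.date DURING LAST_30_DAYS
            ORDER BY metrics.clicks DESC
        """

        for row in ga_service.search(customer_id=CUSTOMER_ID, query=query):
            print(row.segments.geo_target_metro,
                  row.metrics.clicks,
                  row.metrics.impressions)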

  • Marshall

    Not currently working on any search clients, otherwise I would.

  • FangDigitalMarketing

    Thanks for the clarification.

    I’m not sure it “challenges the status quo,” since it’s one instance (or maybe a few now) versus anywhere from 1.2-1.6 million active AdWords clients at any given moment, and then a subset of those that are using regional targeting.

    I think if you guys had a statistically significant number of clients (or even campaigns) that had experienced this, then maybe there would be a case that you’ve actually discovered some sort of targeting anomaly.

    At this point, to get a statistically significant result at a 95% confidence level with a ±5% margin of error, you’d need at least 384 pairs of tests running concurrently (at the same time, with the same set of keywords, bids, etc.) where only the geographic targeting was changed – which isn’t that much, so you may actually have a shot.
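
    (For reference, that 384 figure falls out of the standard sample-size formula n = z²·p(1−p)/e², with z = 1.96 for 95% confidence, worst-case p = 0.5, and e = 0.05; a quick check in Python:)

        import math

        z = 1.96   # z-score for a 95% confidence level
        p = 0.5    # worst-case proportion (maximizes required sample size)
        e = 0.05   # +/- 5% margin of error

        n = z**2 * p * (1 - p) / e**2
        print(round(n, 2), math.ceil(n))  # 384.16 -> 385; commonly quoted as "at least 384"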

    Even then, there’s a really good chance that you’re still not technically talking about the same data sets, because of things like the IP targeting used to determine geography.

    That said, it’s an interesting theory; but I think it’s a little presumptuous to share something so damning at a conference or in an industry trade publication with so very little backup.

    Sorry, I’m really trying not to be the a-hole here, but things like this can be really dangerous (I see too many of them on the organic side as it is).

    Hopefully Google took a look at this article and is running the numbers against its entire dataset to determine if there’s anything to the theory.

    Good luck.

  • Melissa Mackey

    We are running a side-by-side test of this in one of our campaigns and are seeing the same results as Marta: higher CPC and cost/conversion for opt-in vs. opt-out. This isn’t a large scale test, but definitely warrants further testing.

  • Daniel Vardi

    How should we now deal with this when it comes to Enhanced Campaigns? Meaning, if we want to increase a bid percentage for a city within a state, or a state within the US, what happens then?

  • http://www.buzzmaven.com/ Scott Clark

    Implemented on a local campaign somewhat similar to the case study in the article. CTR +12.5%, CPC -18%. Me happy.

  • Marta Turek

    Daniel – Enhanced Campaigns add a level of complexity to everything, for sure. Bid modifiers are a whole other conversation. There were some interesting presentations at SMX Advanced that cautioned against the extensive use of bid modifiers.

  • Ginny Marvin

    Sorry for the confusion, and thanks for the recommendation to use “include” vs “exclude” in the future. Appreciate the feedback.

  • https://plus.google.com/u/0/?tab=wX#109016704019682085950/about/ Gavin Hudson

    We’ve been seeing similar results with Twitter geotargeting. Worldwide campaigns outperform geotargeted campaigns on reach, conversions, and CPC. I think it ultimately depends on what conversion you have in mind. If it’s an action that anybody can take online, then geotargeting will lower conversions and increase CPC.

  • http://www.facebook.com/profile.php?id=32606540 Brian Belgard

    Couldn’t agree with you more on this one. I didn’t see anything in this study that I took issue with, but I would want to see some more intensive controlled testing before I’d be willing to declare victory here. Certainly makes for a good testing hypothesis though.

  • Bryant Garvin

    The one caveat I will add to inclusion versus exclusion targeting is: be careful if you use exclusion targeting because you only provide a product or service in specific areas, especially if you maintain the standard option for Google to include “people in, searching for, or viewing pages about my targeted location”.

    We have found that clients who have done this ended up with a decent percentage of leads from the areas they were hoping to exclude. So just be careful about why you are trying the exclusion vs. inclusion method of geo-targeting.
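
    For what it’s worth, that “standard option” corresponds to the campaign-level geo target type setting; here is a minimal sketch of switching a campaign to presence-only targeting and exclusion using the current Google Ads API Python client (account and campaign IDs are hypothetical placeholders):

        from google.api_core import protobuf_helpers
        from google.ads.googleads.client import GoogleAdsClient

        CUSTOMER_ID = "1234567890"  # hypothetical placeholder
        CAMPAIGN_ID = "987654321"   # hypothetical placeholder

        client = GoogleAdsClient.load_from_storage("google-ads.yaml")
        campaign_service = client.get_service("CampaignService")

        op = client.get_type("CampaignOperation")
        campaign = op.update
        campaign.resource_name = campaign_service.campaign_path(CUSTOMER_ID, CAMPAIGN_ID)

        # Only serve to people physically in the targeted locations
        # ("people in my location only"), and exclude people physically
        # in the excluded locations.
        campaign.geo_target_type_setting.positive_geo_target_type = (
            client.enums.PositiveGeoTargetTypeEnum.PRESENCE
        )
        campaign.geo_target_type_setting.negative_geo_target_type = (
            client.enums.NegativeGeoTargetTypeEnum.PRESENCE
        )

        # Send only the changed fields in the update mask.
        client.copy_from(op.update_mask, protobuf_helpers.field_mask(None, campaign._pb))

        campaign_service.mutate_campaigns(customer_id=CUSTOMER_ID, operations=[op])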

  • bobrichards

    Nothing like learning with the client’s money.