• Brad Geddes

    Google should have been more specific in this and not made it a general guideline for Googlebot.

    You can and should block Googlebot from crawling your dedicated ad landing pages. However, if you are using AdWords, you should allow AdsBot-Google to crawl your landing pages (you’ll have Quality Score issues if you don’t).
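A quick way to sanity-check that split is with Python’s standard-library robots.txt parser. The `/lp/` path and `example.com` URL below are hypothetical, and note that `urllib.robotparser` implements only the standard protocol (it does not model AdsBot-Google’s special handling of blanket wildcard rules), which is why this sketch uses explicit per-agent groups:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: keep Googlebot out of the dedicated
# landing-page folder, but leave AdsBot-Google fully allowed.
RULES = """\
User-agent: Googlebot
Disallow: /lp/

User-agent: AdsBot-Google
Disallow:
"""

rp = RobotFileParser()
rp.parse(RULES.splitlines())

# Googlebot is kept out of the dedicated landing pages...
print(rp.can_fetch("Googlebot", "https://example.com/lp/offer.html"))      # False
# ...while AdsBot-Google can still review them for Quality Score.
print(rp.can_fetch("AdsBot-Google", "https://example.com/lp/offer.html"))  # True
```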

  • Andrew Goodman

    This brings up an interesting point, though it’s probably too much worry for most marketers. You might not want tailored “ad tested” offer pages shown to Googlebot (the organic side) if you create a large number of such pages and are worried about duplicate content, or about canonical versions of pages ranking well in organic search. But you would want AdsBot to crawl them so there isn’t any hole in your landing page quality info. As for how many advertisers and marketers are actually thinking about this, no doubt very few. Most simply don’t block the bots, which seems like the easiest way to approach it.

  • Pat Grady

    Side note here:

    Expand the section called “Excluding your landing pages from review”… it says:

    “Note that these instructions apply only to AdsBot-Google. There are other Google-owned bots that review websites as well (googlebot, for example). In order to avoid decreasing Quality Scores and increasing CPCs for advertisers who don’t intend to restrict AdWords visits to their pages, the system will ignore blanket exclusions and wildcards (for example, User-agent: *) in robots.txt files.”

    They’ve seen too many folks accidentally block their own pages from the Quality Score ads bot, so they’re trying to save us from ourselves. :-)

  • RightTech

    Do you need to block your dedicated landing pages using robots.txt, or can you use the meta robots noindex tag to keep the page out of organic SERPs and still be sure AdsBot crawls your PPC landing page?
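For reference, the tag in question is the standard robots meta tag, placed in the page’s `<head>`. Unlike a robots.txt block, it lets any crawler fetch the page but asks search engines not to index it:

```
<!-- Hypothetical landing-page <head> snippet: keeps the page out of
     organic SERPs while still letting crawlers fetch it. -->
<meta name="robots" content="noindex">
```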

  • Brad Geddes

    Noindex usually works best for keeping the page out of the index.

    However, as a backup plan, I’ll put all my landing pages in a folder like example.com/lp/ and then block that folder with robots.txt as well.
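A sketch of what that backup block might look like (the /lp/ folder is Brad’s example; the exact directives are an assumption):

```
# robots.txt — hypothetical sketch of the backup block described above.
# Per the help text Pat quoted, AdsBot-Google ignores blanket
# "User-agent: *" exclusions, so the AdWords quality review can still
# reach /lp/ while organic crawlers stay out.
User-agent: *
Disallow: /lp/
```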