Google: Parked Domains, Scraper Sites Targeted Among New Search Changes

In what is now a monthly update on search changes, a new Google “Inside Search” blog post today tells us that life is getting tougher for those with parked domains, that life may get better for those plagued by scraper sites, and that those hoping to “push down” negative listings may face a tougher challenge.

New Monthly Search Update

The news comes from a post to the Google Inside Search blog, itemizing ten search-related changes that have been made.

Google published a similar post last month, and it now confirms this will be a monthly update covering changes it considers noteworthy but not big enough to merit blog posts of their own.

From the post:

Today we’re publishing another list of search improvements, beginning a monthly series where we’ll be sharing even more details about the algorithm and feature enhancements we make on a near-daily basis…

We’ve been wracking our brains trying to think about how to make search even more transparent. The good news is that we make roughly 500 improvements in a given year, so there’s always more to share. With this blog series, we’ll be highlighting many of the subtler algorithmic and visible feature changes we make. These are changes that aren’t necessarily big enough to warrant entire blog posts on their own.

While Google calls all of these algorithm changes, some really relate to the search interface, while others affect how Google crawls, which is different from the algorithm that controls how Google ranks pages (to understand more about search algorithms, see our What Is SEO / Search Engine Optimization? page and watch the movie).

On to the changes:

Parked Domains Get Ticketed

One of the most significant changes is that Google says it has a new algorithm to detect parked domains. From the post:

New “parked domain” classifier: This is a new algorithm for automatically detecting parked domains. Parked domains are placeholder sites that are seldom useful and often filled with ads. They typically don’t have valuable content for our users, so in most cases we prefer not to show them.

That’s a pretty easy change to understand. Many domainers I’ve spoken with have understood for years that it has become harder to rank on Google without substantial content on their sites. This change is a clear sign that life is getting harder still.
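Google hasn’t disclosed how its parked-domain classifier actually works. Purely as an illustration of the kind of signal it might use, here’s a naive heuristic sketch: parked placeholder pages typically have almost no body text relative to their ad-style outbound links. Everything here (the function name, thresholds and inputs) is hypothetical.

```python
# Illustrative only: a toy heuristic for spotting parked-looking pages.
# Google's real classifier is not public; this just demonstrates the
# "lots of ad links, little real content" pattern the post describes.

def looks_parked(body_text: str, outbound_links: list[str],
                 min_words: int = 50, max_link_ratio: float = 0.5) -> bool:
    """Return True if the page resembles a parked placeholder.

    body_text      -- the page's visible body text
    outbound_links -- anchor text of the page's outbound links
    """
    words = body_text.split()
    if len(words) >= min_words:
        return False  # enough real content; probably not parked
    link_words = sum(len(anchor.split()) for anchor in outbound_links)
    total = len(words) + link_words
    if total == 0:
        return True  # an empty page is a placeholder by definition
    return link_words / total > max_link_ratio

# A page that is little more than "sponsored listings" links:
print(looks_parked("This domain is for sale.",
                   ["cheap widgets", "buy widgets online", "widget deals"]))
# → True
```

A real system would of course use many more signals (hosting patterns, templates, ad networks), but the content-to-ads ratio captures the core idea of why parked pages “typically don’t have valuable content.”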

Ironically, Google’s own AdSense For Domains program has fueled much of the parked domain industry that its web search team is now penalizing.

Rewarding The Original

Another big change is that Google says it can now better detect which is the “original” page when confronted with several that seem similar:

Original content: We added new signals to help us make better predictions about which of two similar web pages is the original one.

Within a web site, a variety of things can cause a page to be duplicated. However, a bigger issue for many publishers is when people copy or “scrape” their content without permission. These scraper sites sometimes can even outrank the original site for searches.

Google doesn’t specifically say this change is aimed at scraper sites, but it should help with that issue, one Google has been battling especially since launching its Panda Update earlier this year.

Our previous post from August also talks more about this battle: Google Signals Upcoming Algorithm Change, Asks For Help With Scraper Sites.

For Google, it’s also another reason why publishers may want to consider using the canonical tag.
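As a quick refresher, the canonical tag is a link element placed in a page’s `<head>` that tells search engines which URL is the preferred, original version of the content (the URL below is a placeholder):

```html
<!-- On a duplicate or syndicated copy of a page, point back to
     the preferred URL so search engines credit the original. -->
<link rel="canonical" href="https://www.example.com/original-article/" />
```

It won’t stop a scraper from copying your pages, but on legitimate duplicates within or across your own sites, it gives Google an explicit signal about which version is the original.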

Stop Crowding Me

A third big change was Google saying that it’s pulling back on allowing a single site to potentially occupy too much of the top search results. From the post:

Top result selection code rewrite: This code handles extra processing on the top set of results. For example, it ensures that we don’t show too many results from one site (“host crowding”). We rewrote the code to make it easier to understand, simpler to maintain and more flexible for future extensions.

We’ll try for a follow-up here to get further clarity, but about a year ago, Google made it possible for one site to have more than the usual single listing it might get at the top of the page.

The change means that a brand owner may occupy less of the search results page for a search on their name, so competitors or critics may potentially turn up more.

Of course, brands like McDonald’s or Coca-Cola have so many additional sites, along with social media profiles, that they still do well in crowding out others.

My post from September, Should Rick Santorum’s “Google Problem” Be Fixed?, explains this more, at the end.

Rare Words Count For More

An interesting change is that if you’re searching for a “rare” or unusual word, Google is now less likely to return pages that match only an alternate version of your query with that word dropped.

From the post:

Sometimes we fetch results for queries that are similar to the actual search you type. This change makes it less likely that these results will rank highly if the original query had a rare word that was dropped in the alternate query. For example, if you are searching for [rare red widgets], you might not be as interested in a page that only mentions “red widgets.”

Bigger & Fresher

Elsewhere in the post, Google says that it is doing “more comprehensive indexing,” promising that this will make “more long-tail documents available in our index, so they are more likely to rank for relevant queries.”

Google also said that its blog search results are both more comprehensive and fresh. Image results were also said to be fresher.

Suggestions, Tablet Layout & Goal!

In the remaining changes, first, Google says that it will be offering more autocomplete suggestions. Second, it says it has made small changes to improve its look on tablets.

Finally, those looking for Major League Soccer and Canadian Football League scores, rejoice! Google says it will now display scores, schedules and links to game recaps and box scores.



About The Author: Danny Sullivan is a Founding Editor of Search Engine Land. He’s a widely cited authority on search engines and search marketing issues who has covered the space since 1996. Danny also serves as Chief Content Officer for Third Door Media, which publishes Search Engine Land and produces the SMX: Search Marketing Expo conference series. He has a personal blog called Daggle (and keeps his disclosures page there). He can be found on Facebook, Google+ and microblogs on Twitter as @dannysullivan.

  • Agam Panwar

    Thanks for the sum-up Danny! Among all the changes that happened, I liked the “stop crowding me” one – it was actually needed.

    - Agam

  • reshu jain

Thanks for sharing such an important topic :) I visit this blog once a week. You guys are really awesome and provide very informative posts. :)

  • Mike Poller

    All “scraper” sites may not be created equal…
    The Museum of Modern Art does not create any new “content,” it just curates. The Google search engine itself does not create any new content, it just curates, on-the-fly and in real time. Curating has value.
    If I take all the press releases from all the book publishers, for example, and present them in one place, I am creating value by organizing information in time and place. But I did not create one new word of content. Google needs to examine this “scraper” algorithm and try to teach it to give some value to curation.

  • D.A.E.

    We’ve already received an email from SEDO the largest houser of such domains (we believe) saying exactly what you’ve spec’d above…

    This a big positive leap for search and UX.

  • P.I.

Is there word on precisely when they’ll begin targeting scraper sites? I have several of them trolling my site; most come from Russia and the Philippines, but lately I’ve been receiving suspicious visits from Luxembourg. It’s unfortunate that law enforcement can’t do more to take out scrapers, but since many of them operate out of foreign countries, and therefore out of the jurisdiction of US law enforcement, it falls to webmasters and the search engines, e.g. Google, Yahoo, Bing, etc., to flush them out.

  • Matthew Capala

    My first take is that I was quite surprised to learn about the “New ‘parked domain’ classifier,” which allows Google to detect parked domains more easily, making them less likely to show up in the SERP. Since when parked domains ever showed up in Google results? Have you ever seen one when you searched in Google?

    Parked Domains, to me, is a direct navigation game where you target people who misspelled domain names or use domain as a search medium (I guess some folks still search that way). Although on the decline, I have seen some folks making a nice chunk of change through parked domains.

    That said, if you have parked domains that you are monetizing through direct navigation, this algo change would not affect you much because, chances are, you were not listed in search engines on major keywords anyway. My only concern is that when Google associates a domain to a cash parking site, your domain can get devalued and it would take extra time and effort later when you want to use that domain to build a legit site.

    The algo change that impacts a lot of businesses is “Fresher and more complete blog search results.” This is good news for all blog owners, including me. On this note, if you have a website and you are serious about promoting it, you should have at least the website, a blog, and an integrated Twitter account (if you have a lot of time and money you should also explore other social media channels). What this update does is that Blog content now has a faster and deeper indexing system, making your blog even more valuable and likely to rank highly on Google.

Other changes titled “More comprehensive indexing” or “Image result freshness” represent just a couple of improvements to Google’s search technology and, unless you are an SEO consultant, are not worth dwelling on.

    Now, very important one: “Original content.” Google has added new algo signals to help make better predictions about which of two similar web pages is the original one. Those in the content game should fear what it may do to Google SERP (especially after Google Panda), but if you write your own content (like me) you should not be worried. There are tons of websites out there that feed content from other sources and Google may devalue their ranks because of the lack of originality. To me, it is only fair that the author / content creator should get the credit from Google. Is it not?

    I think it is fair that Google is giving transparency to its algo changes. There is a whole industry built around Google’s search results (and I am in it!!!) so I welcome such announcements from Google. Those who remember how previous Google algo changes (recently Panda) rocked couple brands would agree. Listen, the Internet is all about level playing field where a small guy with great ideas and content can compete against Fortune 500 for eyeballs, so as long as Google is fair and rewards content originality, we should all sleep well.

  • BradChism

    Matthew I have seen them all the time, a business directory could even be a parked page depending on what you look at, or google will look at depending on a landing page. A Facebook page could also be considered a landing page depending how you look it as well. Since I am in the web business finding clients that don’t have a website is still very common.

    But I am very happy Google is taking this situation in consideration. Will make it a lot easier to sort through the web and show the value of hiring someone that is more of a specialist or expert instead of a kid off the street thinking a Facebook page is all you need.

  • TDD

    Regarding Google’s algorithm change, it wasn’t a big deal. It allows them to simply find parked domains easier. They’ve been delisting parked pages for quite some time now and parking has been on the decline for years, so there’s no news there – they’ll just be hitting more of them now. Parked pages haven’t been able to secure major rankings for 3+ years now except for extremely rare cases. In that time, no one familiar with domains and parking has been expecting to buy domains, park them, get search traffic and become rich – anyone who has didn’t do their homework first.
