Can You Dig Out Of Your Google Panda Hole By Offloading To Subdomains?

The Wall Street Journal has an article out today reporting that HubPages, acting on advice from Google, has found that moving content to subdomains seems to have helped some of that content recover from a Panda penalty.

By moving some of its content to new subdomains, that content has either escaped the penalty or is being assessed afresh.

The article said:

In June, a top Google search engineer, Matt Cutts, wrote to Edmondson that he might want to try subdomains, among other things.

The HubPages subdomain testing began in late June and already has shown positive results. Edmondson’s own articles on HubPages, which saw a 50% drop in page views after Google’s Panda updates, have returned to pre-Panda levels in the first three weeks since he activated subdomains for himself and several other authors. The other authors saw significant, if not full, recoveries of Web traffic.

Panda is known as a site-wide, domain-specific penalty. Some of the advice given by Google was to move low-quality content to a new domain or remove it completely. In this case, Google appears to be treating the subdomains on HubPages as separate domains, so moving the low-quality content off the main www host and onto a subdomain takes it out of the penalized domain.
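For site owners weighing a similar move, the mechanical part is just a URL mapping. Below is a minimal sketch, in Python, of rewriting a www subdirectory URL for an author onto an author subdomain. The path layout and hostnames are assumptions for illustration, not HubPages' actual scheme:

```python
from urllib.parse import urlsplit, urlunsplit

def to_author_subdomain(url):
    """Map www.example.com/author/slug to author.example.com/slug.

    The /author/slug path layout is hypothetical; adjust to the
    site's real URL scheme before using anything like this.
    """
    scheme, host, path, query, frag = urlsplit(url)
    parts = path.strip("/").split("/", 1)
    # Only rewrite www URLs that actually have an author segment plus a slug.
    if host.startswith("www.") and len(parts) == 2:
        author, rest = parts
        new_host = "%s.%s" % (author, host[len("www."):])
        return urlunsplit((scheme, new_host, "/" + rest, query, frag))
    return url  # already on a subdomain, or an unexpected shape: leave as-is

print(to_author_subdomain("http://www.hubpages.com/edmondson/my-article"))
# -> http://edmondson.hubpages.com/my-article
```

Any real migration would also need 301 redirects from the old URLs to the new ones so that existing links and indexed pages carry over.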

But does Google treat all subdomains as unique domains? Read Vanessa Fox’s article, How Changes To The Way Google Handles Subdomains Impact SEO.

A Google spokesperson gave us an insightful comment about the use of subdomains for this purpose. Google said:

Subdomains can be useful to separate out content that is completely different from the rest of a site — for example, on domains such as […]. However, site owners should not expect that simply adding a new subdomain on a site will trigger a boost in ranking.




About The Author: Barry Schwartz is Search Engine Land's News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics. Barry's personal blog is named Cartoon Barry.






  • Jaan

    I reacted in a similar manner, removing duplicate content and changing the URL structure for good, unique content. No success. My site is much smaller in backlinks and overall authority than HubPages. I feel that HubPages' quick bounce-back has to do with that, but will it last?

  • Michael Martinez

    There are a number of cautionary points regarding this experiment. First, he didn’t begin diversifying content until late June — probably after the last Panda release. Hence, whatever he does now has yet to be evaluated by Panda.

    Second, HubPages probably has a lot of diversified content so if the problem is with the content itself (and not the site design) then splitting that content up across sub-domains should help distinguish between the good and the bad. Then he’ll have to decide what to do with the bad content.

    Third, if the diversification doesn’t show consistent progress past the next 2 Panda releases, he’ll have done a lot of work for nothing.

  • RBM

    A very interesting and thought provoking article, Barry.

    Makes me wonder how simplistic the Google URL ranking factor must be. And this is one more SEO factor (along with load time) to check out before deciding where to have one’s blog hosted — evidently you’re better off if they put you in a subdomain where you’re not dragged down by weaker blogs using the same service. This could be a big deal for blog hosting companies and ISPs.

    Thanks for the info.

  • cameron counselor

    I'm a freelance writer for a site called Seekyt, and I am excited to hear how subdomains can help content-sharing sites. It will be interesting to see how things pan out during the holiday months for those writers that use Amazon.

    Maybe Hubpages will be a chance to see how large sites can recover from the Panda update, but it may be too soon to tell. I will be keeping my eyes open and comparing writing sites to see which one is best for freelancers.

  • Mark Collier

    It’s not the subdomain itself that is causing the increase in traffic from Google; it is Google’s new ability to segment and understand separate content on the site.

    What this shows is that Google has failed, and is still failing, some sites from a crawling and indexing point of view: previously it awarded site-wide credit, meaning every new article was likely to rank highly.

    Then, with the Panda update, it removed some if not all of that credit for content farms, and now the solution seems to be to break up your content so that Google can segment and index it the way it should be.

    Realistically, Google should be able to determine the quality of an article based on the content of that page and page-specific ranking signals, but it has relied too heavily on site-wide factors and is now paying the price with poor search results in some cases.

    Surely Google must refocus on finding difficult-to-manipulate, page-specific factors and reduce the weighting applied to site-wide factors.

  • TimmyTime

    “Surely Google must refocus on finding difficult-to-manipulate, page-specific factors and reduce the weighting applied to site-wide factors.”

    Google is in no rush for that. Frankly, it has no incentive; they aren’t suffering, YOU ARE.

    Google’s earnings went through the roof for the first quarter of full Panda and the bloggers and tech reporters are too busy kissing Google’s @ss to ask any questions.

  • Stupidscript

    There are two big caveats, here:

    1) HubPages gets its content from individual writers of varying quality. Unlike sites that give each writer a subdomain, that content was not previously segregated into separate subdomains … everything USED to be in subDIRECTORIES under the primary “www.” machine name.

    2) This in no way produced good results for the BAD content. What the HubPages experiment did was allow the GOOD content to be recognized amongst the bad apples.

    For example … if ALL of your content is at “www.example.com” and its subDIRECTORIES, then ALL of your content is within the same “example.com” domain. Good and bad content are rated as belonging to the same container. When the bad content drags down the container (the domain), the good content goes down with it.

    If you separate out the “bad” content, and place it in a subDOMAIN, e.g. “bad.example.com”, then that content is no longer part of the “www.example.com” domain … it now exists exclusively in its own container (“bad.example.com”).

    With the bad content separated out, Google is better able to distinguish the “good” content left at “www.example.com”, so it ranks “www.example.com” content higher.

    “bad.example.com” never gets that boost because its content is all “bad”, and it languishes at the bottom of the pile until the site owner cares enough to remove it entirely. Sites that have always maintained separate subdomains for their users let each subdomain rise or fall on its own merits, without being impacted by crappy writing elsewhere under the same parent domain.

    Just a little clarification.

  • Ben Royce

    Michael Martinez is right. This was done in late June, and no global iteration of Panda has gone through since.

    One of my sites did the subdomain trick in early March and it worked well until the next iteration. This isn’t so much an algorithm change as a manual cleansing of the SERPs.

    I’d be wary of this as a long term solution, especially since it was in the WSJ.

  • akronhealthcare

    This update was not good. Although my sites did not get affected, I still believe this type of update is not good.
