• http://www.jaankanellis.com Jaan

    I reacted in a similar manner, removing duplicate content and changing the URL structure in favor of good, unique content. No success. My site is much smaller in backlinks and overall authority than HubPages. I feel that HubPages’ quick bounce-back has to do with that, but will it last?

  • http://www.michael-martinez.com/ Michael Martinez

    There are a number of cautionary points regarding this experiment. First, he didn’t begin diversifying content until late June — probably after the last Panda release. Hence, whatever he does now has yet to be evaluated by Panda.

    Second, HubPages probably has a lot of diversified content so if the problem is with the content itself (and not the site design) then splitting that content up across sub-domains should help distinguish between the good and the bad. Then he’ll have to decide what to do with the bad content.

    Third, if the diversification doesn’t show consistent progress past the next 2 Panda releases, he’ll have done a lot of work for nothing.

  • RBM

    A very interesting and thought-provoking article, Barry.

    Makes me wonder how simplistic the Google URL ranking factor must be. And this is one more SEO factor (along with load time) to check out before deciding where to have one’s blog hosted — evidently you’re better off if they put you in a subdomain where you’re not dragged down by weaker blogs using the same service. This could be a big deal for blog hosting companies and ISPs.

    Thanks for the info.

  • http://seekyt.com/all/user/careercounselor/ cameron counselor

    As a freelance writer for a site called Seekyt, I am excited to hear about how sub-domains can help content-sharing sites. It will be interesting to see how things pan out during the holiday months for those writers who use Amazon.

    Maybe Hubpages will be a chance to see how large sites can recover from the Panda update, but it may be too soon to tell. I will be keeping my eyes open and comparing writing sites to see which one is best for freelancers.

  • http://www.theopenalgorithm.com Mark Collier

    It’s not the subdomain itself that is causing the increase in traffic from Google; it is Google’s new ability to segment and understand separate content on the site.

    What this shows is that Google has failed from a crawling and indexing point of view, and is still failing some sites: previously it was awarding site-wide credit, meaning every new article was likely to rank highly.

    Then, with the Panda update, Google removed some if not all of that credit for content farms, and now the solution seems to be to break up your content so that Google can segment and index the content on your site the way it should be.

    Realistically, Google should be able to determine the quality of an article based on the content on that page and page-specific ranking signals, but Google has relied too heavily on site-wide factors and is now paying the price with poor search results in some cases.

    Surely Google must refocus on finding difficult-to-manipulate, page-specific factors and reduce the weighting applied to site-wide factors.
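
    To make the weighting argument concrete, here is a toy sketch in Python. The 50/50 blend, the page and site quality scores, and the shifted weights are all invented for illustration; nothing here reflects Google’s actual ranking signals or formula.

```python
# Toy model of blending a page-specific signal with a site-wide signal.
# All weights and quality scores below are made up for illustration only.

def rank_score(page_quality, site_quality, w_page=0.5, w_site=0.5):
    """Blend a page-specific quality signal with a site-wide authority signal."""
    return w_page * page_quality + w_site * site_quality

# A thin article on a high-authority site vs. a strong article on a small site.
print(rank_score(page_quality=0.2, site_quality=0.9))  # ~0.55
print(rank_score(page_quality=0.8, site_quality=0.3))  # ~0.55 -- the thin page keeps pace

# Shifting weight toward the page-specific signal changes the outcome.
print(rank_score(0.2, 0.9, w_page=0.8, w_site=0.2))    # ~0.34
print(rank_score(0.8, 0.3, w_page=0.8, w_site=0.2))    # ~0.70 -- the better article wins
```

    Under a heavy site-wide weighting the two pages tie; once page-specific quality dominates, the stronger article pulls ahead, which is the refocus the comment argues for.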

  • TimmyTime

    “Surely Google must refocus on finding difficult-to-manipulate, page-specific factors and reduce the weighting applied to site-wide factors.”

    Google is in no rush for that. Frankly, it has no incentive; they aren’t suffering, YOU ARE.

    Google’s earnings went through the roof for the first quarter of full Panda and the bloggers and tech reporters are too busy kissing Google’s @ss to ask any questions.

  • Stupidscript

    There are two big caveats here:

    1) Hubpages (like WordPress.com) gets its content from individual writers of varying quality. Unlike on WordPress.com, that content was not previously segregated into separate subdomains … everything USED to be in subDIRECTORIES under the primary “www.” machine name.

    2) This in no way produced good results for the BAD content. What the HubPages experiment did was allow the GOOD content to be recognized amongst the bad apples.

    For example … if ALL of your content is at “www.example.com”, and its subDIRECTORIES, then ALL of your content is within the same “www.example.com” domain. Good and bad content are rated as belonging to the same container. When the bad content drags down the container (the domain), the good content goes down with it.

    If you separate out the “bad” content, and place it in a subDOMAIN, e.g. “bad.example.com”, then that content is no longer part of the “www.example.com” domain … it now exists exclusively in its own container (bad.example.com).

    With the bad content separated out, Google is better able to distinguish the “good” content left at “www.example.com”, so it ranks “www.example.com” content higher (see the toy sketch at the end of this comment).

    “bad.example.com” never gets that boost because its content is all “bad”, and it languishes at the bottom of the pile until the site owner cares enough to remove it entirely.

    WordPress.com has always maintained separate subdomains for its users, allowing each subdomain to rise or fall on its own merits without being impacted by crappy writing elsewhere within the “*.wordpress.com” universe.

    Just a little clarification.
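
    To put rough numbers on the container idea, here is a toy sketch in Python. Treating a host-level score as the simple average of its pages’ quality is an assumption made purely for illustration, and the scores are invented; this is not how Google actually computes anything.

```python
# Toy model of the "container" effect: score each hostname by the average
# quality of the pages it contains. All numbers are invented for illustration.
from statistics import mean

good_pages = [0.9, 0.8, 0.85]   # strong articles
bad_pages = [0.2, 0.1, 0.15]    # thin or low-quality articles

# Everything under one host: the bad pages drag the whole container down.
mixed_host = mean(good_pages + bad_pages)   # ~0.50 for www.example.com

# Bad content moved to its own subdomain: each container stands on its own.
www_host = mean(good_pages)                 # ~0.85 for www.example.com
bad_host = mean(bad_pages)                  # ~0.15 for bad.example.com

print(mixed_host, www_host, bad_host)
```

    Splitting the content does not make the bad pages any better; it just stops them from pulling down the average for “www.example.com”, which is the point of the caveat above.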

  • http://benjaminroyce.com Ben Royce

    Michael Martinez is right. This was done in June and global iterations of Panda have gone through.

    One of my sites did the subdomain trick in early March, and it worked well until the next iteration. This isn’t so much an algorithm change as a manual cleansing of the SERPs.

    I’d be wary of this as a long-term solution, especially since it was in the WSJ.

  • akronhealthcare

    This update was not good. Although my sites did not get affected, I still believe this type of update is not good.