Google Panda Two Years Later: 5 Questions With HubPages CEO Paul Edmondson
(Editor’s Note: This is the final article in a 3-part series looking at the aftermath of Google’s Panda algorithm update, which launched February 24, 2011. To catch up, please see the first two articles in the series: Google Panda Two Years Later: Losers Still Losing & One Real Recovery and Google Panda Two Years Later: The Real Impact Beyond Rankings & SEO Visibility.)
HubPages.com fits both descriptions.
Launched in 2006, the site currently has about 130,000 authors who have published more than 1.1 million articles, or “hubs,” as the site calls them. It was regularly listed as one of Panda’s bigger losers, and CEO Paul Edmondson confirms that below when he says Panda “caused a massive loss of traffic and revenue.” But, unlike at least some of the other sites that Panda hit hard, HubPages isn’t planning to change course; its focus is on improving quality.
In the course of researching the first two articles in this series, I reached out to a couple of companies that were hit by Panda (and were relatively easy to contact on a tight deadline). HubPages was the first company to respond; others said they were unable to answer questions due to timing, travel, and so forth. With that in mind, our 3-part series on Panda’s second anniversary ends with this email interview with HubPages CEO Paul Edmondson.
5 Questions With HubPages CEO Paul Edmondson About The Google Panda Update
Matt McGee: How would you describe the initial impact of the Panda update on your website?
Paul Edmondson: The Panda update caused a massive loss of traffic and revenue. It initially felt to us like a very blunt instrument that hit the entire site in an effort to impact low-quality content (as Google sees it) that was present on a portion of our domain.
I was a bit surprised that they would opt for a brute-force tactic as opposed to a more surgical approach in which low-quality pages would be singled out and given lower rankings. We felt Google’s actions communicated a sentiment that content is expendable on a massive scale.
Understanding that Google’s search algorithm updates are carefully constructed and built upon goals we share (to improve the quality of content encountered by online audiences), we immediately went to work to find patterns and address internal quality issues accordingly.
How did you react to Panda in the short-term two years ago, and how have you reacted over the long-term?
Our response involved major shifts in both our technical and financial strategies.
Financially, we had to hire more people to moderate the site’s content. At the same time, revenues were down significantly, and the traffic we were still receiving had a substantially lower value.
As subsequent Panda updates further diminished our income, we had to lay off several people, and we have continued to adapt our cost structure to the constant fluctuations in traffic.
Technically, we tested different means of breaking out the site to see if Google would treat content and authors differently. Ultimately, we chose to go with subdomains. [Ed. note: For more on this, see our article: Can You Dig Out Of Your Google Panda Hole By Offloading To Subdomains?]
We also built technology to check and see if our site’s content was previously published elsewhere in an effort to remove duplicate content from our site and prevent more from being added.
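[Ed. note: HubPages hasn’t published how its duplicate-content checker works. One common approach to this kind of problem is “shingling,” which compares overlapping runs of words between a submission and known published text. The sketch below is illustrative only; the function names and the 0.8 threshold are our own, not HubPages’ actual rules.]

```python
def shingles(text, k=8):
    """Split text into overlapping k-word 'shingles' for near-duplicate detection."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def similarity(a, b):
    """Jaccard similarity between two documents' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def is_duplicate(new_article, published_articles, threshold=0.8):
    """Flag a submission whose overlap with any known article exceeds the threshold."""
    return any(similarity(new_article, existing) >= threshold
               for existing in published_articles)
```

A site could run a check like this both on new submissions (to block copied material before it goes live) and as a batch job over existing pages (to find duplicates already published), which matches the two goals Edmondson describes.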
After we heard from Google that one of our authors (who we thought was pretty good) had quality issues, we started building systems to assess quality on a page-by-page basis in an effort to establish better quality guidelines. This is still a work in progress.
We have learned a lot about the character and tenacity of our team and the HubPages community. Despite Google’s constant changes, we persist and are optimistic about the future. We know that there is a way forward, and we will find it.
What other things did HubPages try before choosing subdomains?
We looked at subdirectories by author as opposed to subdomains. We didn’t test this widely, but it appeared to lead to similar results. We felt that individuals would have to market their own sites more and that subdomains gave the author a stronger sense of ownership. We considered organizing subdomains by topic, but after studying sites organized this way, we didn’t think that particular approach would generate enough of a separation for Google.
We no-indexed a lot of pages, such as tag pages and questions without answers, in an effort to remove “thin” pages. We kept portions of the site, such as our Question and Answer section and our Forums, organized slightly differently for an extended period of time. Q&A was eventually moved to subdomains. We’ve kept the HubPages Forums on the HubPages.com domain, but no-indexed many of the posts.
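[Ed. note: The no-indexing Edmondson describes is typically done by emitting a robots meta tag (or an equivalent X-Robots-Tag HTTP header) on pages a site wants kept out of Google’s index. A minimal sketch of that decision logic follows; the page fields and the 100-word cutoff are hypothetical examples of a “thin page” rule, not HubPages’ actual criteria.]

```python
def robots_meta(page):
    """Return a robots meta tag for a page, no-indexing 'thin' ones.

    `page` is a dict with illustrative fields (type, answer_count, body);
    the thresholds here are made up for the example.
    """
    is_thin = (
        page.get("type") == "tag"                      # auto-generated tag pages
        or (page.get("type") == "question"
            and page.get("answer_count", 0) == 0)      # questions with no answers
        or len(page.get("body", "").split()) < 100     # very short body text
    )
    directive = "noindex, follow" if is_thin else "index, follow"
    return f'<meta name="robots" content="{directive}">'
```

Using "noindex, follow" rather than blocking the pages in robots.txt lets crawlers still follow links through the thin pages while keeping the pages themselves out of search results.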
Did the company ever think about abandoning user-generated articles altogether? (Like Suite101.com is doing, for example.)
No. The most important thing to understand is that HubPages, at its core, is a community of passionate people. What we have done and continue to work on is a quality system that identifies content people like. In some ways, our system resembles algorithms that Google is trying to write, but our approach is driven largely by human assessments. We think it’s really important that unique voices have the opportunity to be discovered, so we are working on systems that are flexible enough to feature content that people will like, but not so rigid that they compromise individuality. Our systems feature everything from the humorous style of Mark Ewbie, to the work of a person who’s been on HubPages for less than a year.
In your opinion, was the Panda update a good thing?
I don’t think we will know if Panda was a good thing for years to come. It’s more difficult for individuals to find an audience now. Many new high-quality pages never see exposure in Google.
I think that Panda slowed content creation significantly because of continued fluctuation in the type of content to which Google sends traffic and the increased risk taken on by content creators who are far less sure that their content will ever get exposure.
Longer, richer pages are more expensive to create, but our data shows that as the quality of a page increases, its effective revenue decreases. There will have to be a pretty significant shift in traffic to higher quality pages to make them financially viable to create. This economic change may create an opportunity for genuine, independent enthusiasts (who enjoy a lower cost structure and the ability to vertically integrate all aspects of the content creation process) to succeed.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.