Why Google Needs The Manipulative Web
The Internet sometimes doesn’t work as we hope it would, or think it should. In certain verticals, great content doesn’t create great links, and because of that, in my opinion, SEOs are often forced to resort to manipulative link practices to get their clients and websites to rank.
Websites with low “content link efficacy” are vertically positioned in areas that aren’t socially friendly, such as health insurance or payday loans, and because of this, are often incapable of truly (and naturally) driving lots of links to their site, no matter how impressive and link-worthy their content is.
Websites with high content link efficacy, on the other hand, can expect their fair share of links when great effort is put into a piece – because they exist in spaces, such as Funny or Die or I Can Has Cheezburger, where users aren’t afraid to spread content by word-of-mouth and expect that their friends will enjoy it too.
In verticals with low content link efficacy, many sites are ranked almost exclusively on the basis of their manipulative link acquisition practices – and very little on the sheer strength of their content.
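To make the idea concrete, a hypothetical "content link efficacy" score could be sketched as the ratio of naturally earned links to pieces of strong content published. This is purely illustrative – the function name and numbers below are my own invention, not anything Google computes:

```python
def content_link_efficacy(natural_links: int, content_pieces: int) -> float:
    """Hypothetical ratio: natural links earned per piece of strong content.

    A high value suggests a vertical where good content reliably attracts
    links (e.g. humor sites); a low value suggests a vertical (payday loans,
    health insurance) where even link-worthy content goes unshared.
    """
    if content_pieces == 0:
        return 0.0
    return natural_links / content_pieces

# Illustrative comparison of two verticals (made-up figures)
humor_site = content_link_efficacy(natural_links=500, content_pieces=50)   # 10.0
payday_site = content_link_efficacy(natural_links=5, content_pieces=50)    # 0.1
```

Under this toy framing, the thesis is simply that verticals near the bottom of that scale cannot be sorted by natural links alone.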
Google’s Stance On Manipulative Linking
Google publicly recommends that webmasters do not “participate in link schemes designed to increase your site’s ranking or PageRank“. So, given what we know about certain content verticals, and also, that Google is “against” link manipulation for the acquisition of PageRank – something doesn’t add up.
If these verticals were ranked on content strength alone, the results would reflect random chance and variance more than anything else. Who happened to discover which site? Did Robert Scoble happen to blog about pornography that day and, instead of picking by quality, link to a site with a terrible reputation – or otherwise happen to pick the worst of the top 10?
What if a high-quality news site decided to link out based on something negative a business did? These kinds of things would lead to razor-thin link profiles – elastic SERPs – and a final first results page that damages user experience.
My argument, then, is that Google needs the manipulative web.
Compare these two environments: one where sites are ranked mostly by variance and other sporadic factors, and one where SEOs are forced into manipulative link practices. It is my thesis that the manipulative environment supplies better search results.
Don’t get me wrong, these results won’t be nearly as good as those in the verticals with high content link efficacy – but beggars can’t be choosers. Websites with the ability to invest in manipulative link practices are still, in some indirect way, demonstrating that they have the resources and infrastructure to stimulate link acquisition – even if it’s not in the way we like. Venture capital investment, money to throw at links, business connections – all of these things correlate in some way with a strong website.
In this environment, the sites with the resources to allocate to manipulative link acquisition are most often the best websites. The multiple factors that contribute to this outweigh the random, varied linking that would otherwise occur through strong content creation alone.
In this scenario, perhaps Google looks the other way on many of these manipulative link acquisition practices because, however much they hurt other verticals, here – in pornography, payday loans, pharmaceuticals and elsewhere – they signal a quality website better than content ever could.
A Problem For Webmasters
The dilemma here, of course, is that this reality would mean that Google would have to say one thing and do another. Their public statement against manipulative link practices would be a partial lie. This makes for an ethical conundrum for many webmasters, leaving those incapable of determining this reality for themselves at a gross disadvantage.
Webmasters who only – and failingly – try to acquire content-driven links will not come close to the level needed to rank in these manipulation-driven verticals. These concerns abound even in the higher-efficacy verticals – so to say they play an even bigger part in verticals where “white-hat” acquisition is practically a myth would be, for the most part, an understatement.
It’s also possible that Google is unaware that these verticals cannot acquire links naturally at a volume capable of accurately sorting the SERPs (unlikely), and steadfastly pursues an internet completely devoid of manipulative link acquisition practices anyway.
This, unfortunately, would likely leave many of these low content link efficacy verticals in even worse shape, and the people needing their services likely scammed or otherwise unable to find what they’re looking for.
A final scenario exists where Google recognizes these verticals – manually or algorithmically – for what they are – the “slums” of the SERPs – and largely ignores them. Few Google cop cars drive through (not even the robot cars), and spamming tendencies that aren’t explicitly black-hat are deemed OK – for the benefit of the long-term viability of the SERPs and, otherwise, intelligent time allocation for the webspam team.
I see this as the most likely scenario, and one every intelligent webmaster competing in these verticals should be aware of. Every vertical and SERP is one webmasters should take a hard, deep look at – and from there, adjust their search strategy, and ethical stance, accordingly.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.