The Social Graph: Revolution or Evolution?
Following Danny Sullivan’s interview with Google and Bing that confirmed social media reactions are a ranking factor with both search engines, it’s easy to get excited about social media being the new big thing in search engine optimisation (SEO).
Others are less impressed, not convinced that this will change anything in the real world, especially in regard to traditional organic search results. And not just because of the statements from Google that tweeted links currently only pass value in “limited situations” and that social buzz is a ranking factor “especially for news”.
In earlier internal discussions about these interviews, Receptional’s CTO Andy Langton wrote: “There’s nothing particularly unusual about ‘social’. Is Wikipedia social? Amazon reviews? Forums? To me, they’re just sites that need to be evaluated, just as places like myspace have been for years.”
I feel he raises an excellent point. On more than one occasion there has been discussion in the SEO community over whether Wikipedia’s nofollowed external links are treated differently to those on other sites. Generally, when a site or page is linked to from a Wikipedia article, it’s a strong signal that it’s a useful resource, even if those links can’t be policed 100% of the time.
It makes a fair amount of sense that Google, being in the business of delivering quality search results, would not ignore these signals, especially as it could be argued that Wikipedia is a more established online entity than Twitter: it has been around for far longer and continues to attract significantly more unique visitors, according to Compete and Alexa data.
So are Twitter and Facebook just more sites that are (or may be) treated specially by search engines, with the idea of a social graph as a current reality being a trendy spin on something that has existed in some form for a while? Or are they a completely new type of influence on organic results?
Before Twitter and Facebook, for a number of years the vast majority of social online discussion was carried out on various bulletin boards and forums. Google can identify discussion results of this type, and it seems plausible that links from such pages could be treated differently to the rest of the web.
The same could be true of links in blog comments. If Akismet can identify which comments are spam, is it not conceivable that Google could do so too? And from the reasonable surfer patent, granted earlier this year, we know that Google’s crawlers can identify different areas of content on a page and treat them to different standards.
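To make that idea concrete, here is a minimal sketch of “reasonable surfer”-style link weighting. The regions, weights and function names are entirely my own illustration; Google’s real signals and values are not public.

```python
# Toy illustration of weighting links differently by the page region they sit in.
# The regions and weights below are invented for the example.

REGION_WEIGHTS = {
    "main_content": 1.0,    # editorial links in the body of an article
    "user_comments": 0.2,   # links left in blog comments or forum replies
    "boilerplate": 0.05,    # footers, blogrolls, sidebars
}

def link_value(base_value, region, looks_like_spam=False):
    """Scale a link's value by where it appears and whether it looks like spam."""
    if looks_like_spam:
        return 0.0
    return base_value * REGION_WEIGHTS.get(region, 0.1)

# A spam-flagged comment link passes nothing; a clean comment link passes far
# less than an editorial link in the main content.
print(link_value(10, "main_content"))         # 10.0
print(link_value(10, "user_comments"))        # 2.0
print(link_value(10, "user_comments", True))  # 0.0
```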
Historically, we don’t know how Google has treated links from sites like Digg, Reddit and Del.icio.us, many of which were around before Twitter and Facebook. They could be handled the same as any other website, but as social bookmarking became more and more significant in shaping the content people view and consider to be notable, does it not stand to reason that someone at Google would have considered factoring the role of these sites into ranking algorithms?
Another question: if Usenet were still popular today, would it be considered a form of social media? Today, newsgroups are mostly dead, full of spam, and the vast majority of traffic is binary files. But for a long time Usenet was the equivalent of Twitter: the biggest and best venue for social and professional discussion and interaction online, covering a huge variety of topics and hosting public debates on significant matters such as the early development of the Internet.
Arguably, the first serious discussion of archiving and retrieving public online information took place on Usenet; in early 1985 Chuq Von Rospach posted an RFC for a “usenet article archive program with keyword lookup”.
Social discussion online is obviously not new in itself. Usenet used to be a particularly social platform, distinguished from walled-off forums by being decentralised and entirely public. The same metrics used to grade the value of Tweets and Tweeters could be used in any other public arena of social discussion where links (or their equivalent) are shared, presuming that individual contributors can be identified (which would admittedly be harder on Usenet than on Facebook or Twitter).
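As a thought experiment, here is one way such a platform-agnostic metric might look. Everything in it, from the formula to the numbers, is a hypothetical illustration, not a description of how any search engine actually scores social mentions.

```python
# Hypothetical sketch: scoring a shared link by the authority of the
# contributors sharing it, whatever the platform (Twitter, Usenet, Digg...).
# All values and weightings here are invented for illustration.

def contributor_authority(followers, account_age_days, spam_score):
    """A made-up authority metric combining reach, longevity and trust."""
    return max(0.0, (followers ** 0.5) * min(account_age_days / 365, 3) * (1 - spam_score))

def shared_link_score(shares):
    """Sum contributor authority over everyone who shared the link."""
    return sum(contributor_authority(*share) for share in shares)

# Two long-standing, trusted accounts can outweigh a swarm of throwaway ones:
organic = [(5000, 1200, 0.05), (900, 2000, 0.1)]
spammy = [(10, 5, 0.9)] * 50
print(shared_link_score(organic) > shared_link_score(spammy))  # True
```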
There’s no reason search engines couldn’t already have been using methods like these to assess the quality of article submissions and submitters on, say, Digg for a long time; but unless a specific announcement were made, the SEO community would be in the dark.
Factual statements from Google on ranking factors, as seen in Danny Sullivan’s well-conducted interview, are a rare thing. No SEO knows all the intimate details of Google’s ranking algorithms, either currently or historically. I think many in web marketing mistake this information void for evidence that ranking algorithms are simpler than they likely are in reality, and see information like this as the announcement of something new.
But we don’t know how long Google have been assessing Tweets as a ranking metric, and we also don’t know if it’s only a complement to similar, established processes used with other individual sites and/or types of user-generated content.
Whatever the case, it won’t change the fact that producing unique, compelling content and marketing it well is the simplest strategy for attracting online traffic.