2014 SEO Playbook: Off-Page Factors

Are you ready for 2014?

Today’s column marks the third and final entry in my annual SEO Playbook. Part 1 primarily focused on what Hummingbird will mean for marketers in 2014, especially as it relates to content and authority. Part 2 took an updated look at on-page SEO factors, including content, HTML and architecture.

In Part 3, I’ll discuss off-page factors that SEOs will need to consider as we enter the new year. Enjoy!

Links: Quality

There is a lot to be said about links. Google will continue its trend of getting more discerning and aggressive with penalties in 2014. Quite frankly, its link analysis keeps getting better, and the search and spam teams are increasingly confident that they will not unjustly target innocent websites.

One mantra from 2013 is that “link building is dead.” I wouldn’t go that far. There is a lot you can do to encourage links without resorting to artificial means or outright begging. I see link building programs being folded into influencer marketing programs and becoming more networking-oriented. In my opinion, there is nothing wrong with sending out an e-mail notifying your network about your new content, as long as the decision of whether to link or not is ultimately up to them.

Diversity — i.e., links from a variety of sources — is also important. If all of your links are coming from your network or the same websites over and over, you could be in trouble. You cannot put your content on autopilot and check off tick boxes in your editorial calendar. You really need to be actively promoting your content, enough to grow a real audience. When you do this, link diversity tends to take care of itself.

For websites that already have a lot of low-quality links out there (and I would absolutely suggest an audit to find this out, especially if you’ve ever used a link building service), you have a difficult choice ahead of you if your site has not already been hit with a Penguin penalty. Do you engage in a link-cleaning program because you fear a future algorithm update may strike your site, or do you do nothing and wait it out?

This can be a difficult judgment call, one where professional counsel is in order. I would suggest attempting to remove the most egregious links and engaging in a wider campaign if more than 40% of your links are of low quality. For full disclosure, 40% is not a scientific number; it is a guesstimate. Let’s just say that if I looked at the website and saw that 40% or more of its offsite links were low quality, my skin would start to crawl.
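To make that judgment call concrete, here is a minimal sketch of the rule of thumb above. The 40% cut-off is the guesstimate from this column, not a scientific threshold, and the classification of each link as low quality is assumed to come from a prior manual audit; the function and sample data are hypothetical.

```python
def needs_cleanup_campaign(links, threshold=0.4):
    """links: iterable of (url, is_low_quality) pairs from a link audit.
    Returns True when the low-quality share exceeds the threshold."""
    links = list(links)
    low_quality = sum(1 for _url, is_bad in links if is_bad)
    return low_quality / len(links) > threshold

audit = [("http://goodsite.example/post", False),
         ("http://linkfarm.example/page", True),
         ("http://news.example/story", False)]
print(needs_cleanup_campaign(audit))  # 1 of 3 links is low quality: False
```

The point of wrapping this in code is simply that the decision should be made from a measured ratio, not a gut feeling about a handful of links.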

Even if you are not under a penalty, copiously log your link cleanup efforts. Should you be hit with a manual penalty in the future, this log can help to demonstrate that you’ve already made efforts toward rehabilitation and may speed up the reconsideration process.

Another concern is when too many offsite links use the same anchor text. This can occur quite naturally when other sites link to your pages using the article title or title tag. That is generally fine. The real concern comes from unnatural repetition of individual keywords or key phrases. To date, Google shows little interest in grandfathering old links, so be certain to include anchor text as part of your link spam analysis.
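A simple tally is enough to spot the kind of anchor-text repetition described above. This is a rough sketch: the link list, the 20% concentration threshold, and the function name are all hypothetical illustrations, not a Google-defined limit.

```python
from collections import Counter

def anchor_text_concentration(links, threshold=0.2):
    """links: iterable of (source_url, anchor_text) pairs.
    Returns the anchors whose share of all inbound links exceeds
    the threshold, mapped to that share."""
    counts = Counter(anchor.strip().lower() for _src, anchor in links)
    total = sum(counts.values())
    return {a: n / total for a, n in counts.items() if n / total > threshold}

links = [("http://a.example", "cheap blue widgets"),
         ("http://b.example", "cheap blue widgets"),
         ("http://c.example", "cheap blue widgets"),
         ("http://d.example", "Acme Widgets"),
         ("http://e.example", "this article")]
print(anchor_text_concentration(links))  # {'cheap blue widgets': 0.6}
```

A branded anchor or article title dominating the distribution is usually natural; a money keyword dominating it, as in the sample data, is the pattern worth investigating.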

As for Google’s link disavowal tool, I would not bother with this unless you are certain the penalty is in place. If you are doing preventative off-site link rehabilitation, another reason to keep a log is so that you can populate the disavow tool quickly if you need to.
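If you keep that cleanup log as structured data, producing a disavow file later is mechanical. The output format below (`#` comment lines, `domain:` entries for whole domains, bare URLs for individual pages) is the documented format of Google's disavow tool; the helper function and the log contents are hypothetical.

```python
def build_disavow_file(bad_urls, bad_domains):
    """Render text in the disavow file format: comments start with '#',
    whole domains use a 'domain:' prefix, individual URLs get their own lines."""
    lines = ["# Generated from our link-cleanup log"]
    lines += [f"domain:{d}" for d in sorted(bad_domains)]
    lines += sorted(bad_urls)
    return "\n".join(lines) + "\n"

print(build_disavow_file(
    bad_urls={"http://spammy.example/widgets-page"},
    bad_domains={"linkfarm.example", "paidlinks.example"},
))
```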

Links: Quantity

When it comes to links or domain authority or page authority, the old adage has always been “quantity and quality.” This will not change. If you are not earning new and better links at a faster rate than your keyword competitors, you’ll probably lose many ranking battles.

Links: Paid

There is not much I can say here other than do not purchase links in hopes of better rankings. If you have gone to a search engine optimization conference over the last year, you probably noticed that link sellers are disappearing from the exhibit floor. There is a reason for this, and it is because Google has their number. Just don’t do it.

Trust: Authority

Trust has really started to evolve as a search engine ranking factor, or set of factors. Old signals like domain age are less important, partially because they were never that meaningful to begin with and partially because search engines are able to put more faith in new and better algorithmic signals.

Now, in addition to links from high trust sites like whitehouse.gov or adobe.com, trust is more about things like brand recognition and author recognition. You can be certain that Google and Bing have a database of brands and an automated way to add new ones to the list. Brands are important and get a boost in the rankings — not because you know them, but because people write about them and link to them.

One of the best ways to build trust is to employ author and publisher tags on your content while encouraging your writers to be professionally active in social media. I am also a big fan of inviting or hiring trusted influencers to contribute and write for your company blog. I realize there is an ongoing debate about paid content, so let me be clear: I am not advocating purchasing paid content for the sake of getting stuff onto your website. I am saying seek out recognized experts to write amazing stuff for you and pay them what they are worth.

Another way to build author trust is to have a central author as a voice for your company blog. One person devoting their time to developing great content and promoting it in social media will go a lot further than having round robin contributions from everybody on your staff. Author trust is something to be developed, and as it grows, so does the trust given to all of their past articles.

Trust: Piracy

Is your content management system up-to-date? Most CMS updates include security patches to prevent takeovers and piracy. Do not fall behind.

If your server or website does get hacked or infected with malware, take it offline immediately and put up a 503 page. This lets the search engines know your site is temporarily offline and will return shortly. If the search engines have blocked your site to protect their users, do not go back online until you have solved the problem; then file a reconsideration request. One of the golden rules is to never ask your paid-advertising Google or Bing representative for help with nonpaid search. This is probably my one exception to that rule. Whether it will work is debatable, but after you fix your website, anything you can do to speed up the reinclusion process is worth doing. Besides, if you were using paid search and your site gets taken down for malware, you want your ads working again as soon as the website is fixed.
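Serving that 503 correctly matters: it should carry a Retry-After header so crawlers treat the outage as temporary. How you do this depends on your server stack; as an illustration only, here is a minimal placeholder server using Python's standard library (the one-day retry hint and port are arbitrary choices).

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every request with 503 Service Unavailable plus a
    Retry-After header, telling crawlers the outage is temporary."""

    def do_GET(self):
        self.send_response(503)
        self.send_header("Retry-After", "86400")  # hint: retry in one day
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Temporarily down for maintenance</h1>")

    def log_message(self, *args):  # keep the demo quiet
        pass

def make_maintenance_server(host="127.0.0.1", port=0):
    """Build (but do not start) the placeholder server."""
    return HTTPServer((host, port), MaintenanceHandler)

# server = make_maintenance_server(port=8080)
# server.serve_forever()
```

In production you would typically configure the equivalent response in your web server or CDN rather than run a script like this, but the status code and header are the parts the search engines actually see.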

Social: Connections & Interactions

The reality of social media as a search engine ranking factor has not met the hype created by the search engines and optimization professionals. To be sure, social media is a ranking factor and one that will continue to become more important. That said, social will not replace link authority any time soon, and it appears to be progressing slower than anticipated.

Social media metrics such as Facebook likes and shares or Twitter mentions and retweets have a high correlation with high rankings. But, as the search engine representatives like to remind us, correlation does not equal causation. Right now, this really is a case where popular websites and influencers are as likely to get links as they are social votes.

It is important to understand the relationships that search engines have with social media sites and be active on those sites. For example, Google owns Google+, while Bing has relationships with Twitter and Facebook. And of course, personalized results will continue to be influenced by social media connections. If a user has a connection to a person or brand, search engines will use those connections to display relevant content.

Personalization: Country & Locality

International and local search results have been areas of focus for several years now. There are plenty of things to optimize for international and local results, such as proper use of subdomains or country-code top-level domains, tagging pages with language codes, registering geographic targets in Google Webmaster Tools, and registering businesses in Google+ and Bing Places for Business. Do not ignore these.
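The language-tagging piece of that list is usually done with `rel="alternate" hreflang` link tags, which map each locale to the URL that serves it. A small helper can generate them; the language codes and URLs below are hypothetical examples.

```python
def hreflang_tags(alternates):
    """alternates: mapping of language(-region) code -> page URL.
    Returns <link rel="alternate" hreflang=...> tags, one per line,
    telling search engines which URL serves which locale."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in sorted(alternates.items())
    )

print(hreflang_tags({
    "en-us": "http://example.com/",
    "pt-pt": "http://example.com/pt/",
}))
```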

At the same time, it once more comes back to links. If you are getting links from sites related to the geographic locations you’re targeting, your website is more likely to break into local rankings for those places. Factors like IP address and server location will continue to become less influential as search engines get better at measuring user-centric signals.

I think this is one area in which social media will eventually play a major role. For example, if many people in Portugal have a company in their Google+ circles, that company may be more likely to appear in the search engine result placements inside Portugal. That type of signal is a lot more meaningful than whether the server resides within Portugal or the website is written in the Portuguese language (which multiple countries use).

Personalization: History

Like social media, personal history is a ranking factor that is slowly coming into its own. Right now, if you are logged into Google or use Chrome and visit a web document, that page or site is more likely to show up in future search results. If social media friends visit a page, that document or site is more likely to show up in your future results. Based on personal experience, this is pretty fluid and seems to be one of those things the search engines keep evolving.

Going forward, it makes a lot of sense for search engines to give trust to webpages that lots of people visit, something they can evaluate by using the collected search history data stored in their databases.


Overall, all of the ranking factors boil down to quality, authority and trust. As search engines find new ways to collect data and become better at evaluating the data they already have, it makes a lot of sense that the algorithm will shift from easy-to-measure but less useful signals (like domain age or server location) toward more difficult to measure signals that are more telling (like visitor location and author trust).

For the last two years, we have been seeing that, thanks to Panda and Penguin, the search engines finally have teeth to put behind their policies and guidelines. Search engine optimization is no longer about technical tricks designed to outwit Google and Bing. It is about building an audience, earning trust, and publishing genuinely useful information that people want to consume.

Some call this a new age of search engine optimization. Others say it is the end of SEO and the Golden Age of inbound marketing. One thing is for certain, though: with our current technology, we have more data than ever before to tell us what is working and what is not. Ultimately, the winners are those websites and businesses that can accept the new realities and do something with them.

Obviously, I am a big content proponent because it is the basis for everything from keyword rankings to earning links and attracting influencers. At the same time, one size does not fit all. You must understand and execute what is best for your business given your goals and objectives. Just keep in mind that search engine rankings get earned because of great online marketing programs — and SEO does not create great online marketing.

Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.



About The Author: Thomas Schmitz operates Schmitz Marketing, an Internet Marketing consultancy helping brands succeed at Inbound Marketing, Social Media and SEO. You can read more from Tom at Hitchhiker's Guide to Traffic.



  • James Perrin

    Nice post Tom. I think Personalisation is a really important factor, especially from a Content Marketing perspective. For the user to have tailored content based on where they are in the buying cycle is crucial to improving conversions. I understand your post was purely regarding SEO, but I think Content and Social are now so inextricably linked with SEO, it’s hard not to mention it. Good post.

  • http://wtff.com/ JustConsumer

    Tom, excuse me, but could you please clarify, why do you believe, that your suggestions are noteworthy ? I mean beside the fact, that you can write long, marketing style texts.

    I’m asking, because your own projects ( inboundbound.com and sam.ly ) don’t look successful at all. The first one almost doesn’t have traffic and the second one …… is it in pre-alfa test stage ?

    However you’re making statements like you tested your suggestions, got positive outcome and now can recommend to others.

    I suppose I just missed something. Where is it possible to see your successful online projects, that can prove the value of your statements and suggestions ?

    Thanks )

  • CrosbyTee

    So if I have a link on my site to xyz.com how does Google tell if it’s paid for or not?

  • Thomas Schmitz

    Sure. My experience comes from working with big brands, media companies and start-ups at Portent for over 5 years then on my own. Sam.ly is a personal click-board I tossed together one day to replace iGoogle, that I then made available as a free tool. And yes, InboundBound needs loving. Most of my writing is here on SEL.

  • http://wtff.com/ JustConsumer

    Thank you for reply.

    Do I understand correct, that in 2013 you didn’t work “with big brands, media companies and start-ups at Portent” ?

    Could you please specify projects you worked with on your own in 2013 ?

    Don’t get me wrong, I just want to see the proof of your experience in real life, before taking your suggestions seriously.

  • Thomas Schmitz

    Google will look for clues. It can be as simple as the word “Sponsored” above the link, suspicious behavior like having randomly placed or out of context anchor text, or as subtle as matching a statistically driven profile developed from examining thousands of sites.

    Google does not say much about how it does this, at least not beyond the obvious. It’s secret sauce. And while it’s easy to credit Google with omniscient powers they do not possess, the reality is the Web Spam Team is good at this and getting better.

  • http://www.joshclosser.com/ Josh Closser

    Use your real identity….

  • http://wtff.com/ JustConsumer

    Reply by Thomas Schmitz posted below is wrong.

    Google doesn’t care if the link is paid or not.
    Google does care about the PageRank passing.

    Here is the absolutely precise reply to your question from Google itself :
    ” To address the issue, make sure that any paid links on your site don’t pass PageRank … make sure that any paid hyperlinks have the rel=”nofollow” attribute ”


    Feel free to have as many paid links as you want on your website. But no PageRank passing.

  • http://www.eyewebmaster.com Rosendo A. Cuyasen

    I believe Google has its own tools to identify links from suspicious website. We have some website who has back links problems and we’ve taken those bad links but when we try to submit again to them they told us that there are still back-links that are not appropriate for that site. Funny and thinking how they get those bad back links..


  • http://ignitevisibility.com John E Lincoln

    Great post I totally agree with your points.

  • Neeraj Kumar

    for my poem blog i have got about 4500 links which come from blogs or webstes where my blog or blogfeed is listed.
    My concern is when I post a new poem it creates many new such links. Would they be counted as spamming.

