Matt Cutts On Penalties Vs. Algorithm Changes, A Disavow-This-Link Tool & More

Is it a penalty? Or is it just a change to Google’s algorithm? That’s been one of the hot topics in search marketing in recent months thanks to the Panda and Penguin updates, and it was one of the topics of discussion tonight at our SMX Advanced conference in Seattle.

During the annual “You & A with Matt Cutts” keynote session, Google’s web spam chief told Search Engine Land Editor-In-Chief Danny Sullivan that Google’s definition of a “penalty” is when manual action is taken against a site — and that Google doesn’t use the term “penalty” as much as they say “manual action.” Cutts went on to say that neither Panda nor Penguin are penalties; they’re both algorithm updates.

He also mentioned — and this will be good news to many search marketers — that Google is considering offering a tool that allows webmasters to disavow certain links, but that may be months away if it happens.

Other topics included why some spam reports aren’t acted on, whether Google+ and +1 votes are a strong SEO signal right now and much more. We’ll have separate coverage of those topics in future articles, but for now you can read my full (and largely unedited) live blog below.


We’re just moments away from our annual “You & A with Matt Cutts” keynote at SMX Advanced in Seattle. The room is packed like sardines in a can and, with all of the recent Panda and Penguin news buzzing around the search marketing industry, this conversation should be interesting, to say the least.

Search Engine Land’s Editor-In-Chief Danny Sullivan will be handling host duties, and I’ll do my best to keep up with the discussion below. So, stay tuned, hit your Refresh button every few minutes if you want, and follow along with all of us here in Seattle.

So we’re actually starting out with that hysterical video by Sam Applegate in which Matt Cutts explains how to rank number one on Google:

Danny and Matt have arrived to a penguin-filled stage and we’re getting started. And Matt has just thrown one of the stuffed penguins right at me, nearly taking my head off. But he missed, which is proof that he’s better at fighting web spam than at throwing stuffed penguins.

Danny: What’s the deal with Penguin? Is it a penalty?

Matt: We look at it as something designed to tackle low-quality content. It started out with Panda, and then we noticed that there was still a lot of spam, and Penguin was designed to tackle that. It’s an algorithmic change, but when we use a word like “penalty,” we’re talking about a manual action taken by the web spam team — it wasn’t that.

We don’t think of it as a penalty. We think of it as, “We have over 200 signals, and this is one of the signals.”

DS: So from now on, does “penalty” mean it’s a human thing?

MC: That’s pretty much how we look at it. In fact, we don’t use the word “penalty” much, we refer to things as a “manual action.” Part of the reason why we do that breakdown is, how transparent can we be? We do monthly updates where we talk about changes, and in the past year, we’ve been more transparent about times when we take manual action. We send out alerts via Google Webmaster Tools.

DS: Did you just do another Penguin update?

MC: No.

Danny references the WPMU story and Matt says that the site recovered due to the data refreshes and algorithmic tweaks.

DS: Now we hear a lot of people talking about “negative SEO.”

MC: The story of this year has been more transparency, but we’re also trying to be better about enforcing our quality guidelines. People have asked questions about negative SEO for a long time. Our guidelines used to say it’s nearly impossible to do that, but there have been cases where that’s happened, so we changed the wording on that part of our guidelines.

Some have suggested that Google could let sites disavow links. Even though we put in a lot of protection against negative SEO, there’s been so much talk about it that we’re talking about enabling that, maybe in a month or two or three.

DS: asks about different types of links

MC: We’ve done a good job of ignoring boilerplate, sitewide links. In the last few months, we’ve been trying to make the point that not only is link buying like that not doing any good, we’re turning the dial up to let people know that certain link spam techniques are a waste of money.

DS: asks about messaging

MC: If you roll out a new algorithm, it can affect millions of sites. It’s not practical to notify website owners when you have 500 algo changes every year, but we can notify when there’s been manual action against a specific site.

One thing I’d like to clear up — the news earlier this year about 700,000 warnings. The vast majority of those were because we started sending out messages even for cases of very obvious black hat techniques. So now we’re completely transparent with the warnings we send. Typically your website ranking will drop if you don’t take action after you get one of those warnings.


DS: Anything new related to paid links?

MC: We’re always working on improving our tools. Some of the tools that we built, for example, to spot blog networks, can also be used to spot link buying. People sometimes think they can buy links without a footprint, but you don’t know about the person on the other side. People need to realize that, as we build up new tools, paid links becomes a higher risk endeavor. We’ve said it for years, but we’re starting to enforce it more.

I believe if you ask any SEO whether SEO is harder now than it was 5-6 years ago, they’d say it’s a little more challenging. You can expect that to increase. Google is getting more serious about buying and selling links. Penguin showed that some stuff that may work short term won’t work in the long term.

DS: Affiliate links. Do people need to run around and nofollow all that?

MC: If it’s a large enough affiliate network, we know about it and recognize it. But yes, I would recommend nofollowing affiliate links. (That’s a paraphrase! Not an exact quote – sorry.)

DS: Do links still work, or are social signals gonna replace them?

MC: Douglas Adams wrote “Space is big. You have no idea how big space is.” The web is like that. Library of Congress, the biggest library in the world, has 235 terabytes of data. That’s not very big compared to the way the web grows.

The actual percentage of nofollow links on the web is a single-digit percentage, and it’s a pretty small percentage. To say that links are a dead signal is wrong. I wouldn’t write the epitaph for links just yet.

DS: You do these 30-day challenges, like “I’m gonna use Bing for 30 days.”

MC: I have not done that one, and I’m afraid to try! (huge laughter from audience – Matt then says he’s joking and compliments Bing team)

Danny challenges Matt and Google to do something to see the web from an SEO’s point of view, and says that SEOs should try to see things from Matt’s perspective, too.

DS: What’s up with your war on SEOs? (laughter) Or is it a war on spam?

MC: It’s a war on spam. If you go on the black hat forums, there’s a lot of people asking, How do I fake sincerity? How do I fake being awesome? Why not just be sincere and be awesome? We’re trying to stop spam so people can compete on a level playing field. I think our philosophy has been relatively consistent.

DS: What about tweets earlier today about using bounce rate? You don’t look at how quickly someone bounces from a search result and back to Google?

MC: Webspam doesn’t use Google Analytics. I asked again before this conference and was told, No, Google does not use analytics in its rankings.

And now we’re going to audience questions.

DS: What percent of organic queries are now secure?

MC: The launch was a little backwards, because we didn’t want to talk about being able to search over different corpi/corpuses. It was a single-digit percentage of traffic in the US, and then we rolled it out internationally.

I think it’s still a minority of the traffic now, but there’s things like Firefox adding SSL search in the browser. There’s a lot of things aimed at helping users with privacy. I recognize that’s not good for marketers, but we have to put users first. We feel like moving toward SSL, moving toward encrypted, is the right long-term plan.

DS: (reading audience question) How come WordPress didn’t get penalized with all the blogs that have WordPress links in their footer?

MC: If you look at the volume of those links, most of them are from quality sites. WPMU had a pretty good number of links from lower quality sites.

DS: How come AdWords isn’t being blocked from keyword referrals?

MC: If we did that, every advertiser would do an exact match for every phrase and then the ad database would grow exponentially. He adds that he wishes Google had reconsidered that decision, though.

(I missed the next question.)

Matt explains that the web spam team has been working together with search quality people and other groups. He’s using it to further explain the difference between a penalty and an algorithm adjustment.

DS: So we have positive ranking factors and negative ranking factors?

MC: Yes.

DS: asks question about rich snippet spam

MC: Used to be that people wondered why it was so hard to get rich snippets; now it’s the other way around. We’re looking at ways to handle the abuse … missed the exact quote, but he said something about maybe removing the ability for a domain to have rich snippets if there’s abuse.

DS: asks question about link removing after getting an alert in Webmaster Tools

MC: We want to see an earnest effort to remove the links. When you do a reconsideration request, we’ll look at some of the links and see “how much progress have they made?” We’ve talked about the idea of adding a disavow-this-link tool.

DS: What if you can’t get rid of bad links pointing to a page, should we get rid of the page?

MC: If it’s not an important page, you could. Or you could at least document the effort to remove the links and share it with us.

DS: What percent of spam reports does your team take action on?

MC: We have a good list of leads ourselves. We’ve shut down tens of thousands, maybe hundreds of thousands of domains involved in link buying. When you get a spam report, you want to take action, but it may not be as high impact as doing something about one of our own leads. We use a factor of four: we multiply the potential impact by four, and if it still shows up near the bottom of the list, we may not take action on it.

DS: asks question about Google+ and SEO

MC: When we look at +1, we’ve found it’s not necessarily the best quality signal right now.

DS: You have to be on Google+ if you want to rank well in Google.

MC: No!!!! It’s still early days on how valuable the Google+ data will be.

DS: Why’d you call it Penguin, by the way?

MC: For Panda, there’s an engineer named Panda. For Penguin, we thought the codename might give away too much about how it works, so we let the engineer pick a name.

DS: If you were hit by Panda and Penguin, should we just give up? (audience roars with laughter)

MC: Sometimes you should. It’s possible to recover, but if you’re a fly-by-night spammer, it might be better to start over.

DS: What’s the deal on paid inclusion? Is it coming to web search?

MC: You call it paid inclusion, but it’s a separately labeled box and it’s not in web ranking. Google’s take on paid inclusion is when you take money and don’t disclose it. Google’s web rankings remain just as pure as they were 10 years ago. We have more stuff around the edges, that’s true, but that stuff is helpful. Matt mentions using Google Flight Search to book his trip here to Seattle. “You can’t buy higher rankings. That hasn’t changed. I don’t expect it to change.”

DS: Mentions that some people have been really mean to Matt recently.

MC: I’ve had a lot of people yell at me over the years. I’ve developed a thick skin. People aren’t striking out because they’re vicious, they’re striking out because they’re hurt or they believe Google isn’t doing the right thing. You want to listen to that. Some of our best launches have come from some of the most passionate criticism.

DS: What are you most excited about right now in search?

MC: I like some of the stuff we’re doing that hasn’t launched yet. I do like the Knowledge Graph a lot. I’m really excited that we’re pushing for more transparency. If you’d told me 10 years ago that we’re going to tell every spammer when we catch them, I would’ve said you were crazy.

And with that, we’re done. Thanks for tuning in!



About The Author: Matt McGee is Editor-In-Chief of Search Engine Land. His news career includes time spent in TV, radio, and print journalism. His web career continues to include a small number of SEO and social media consulting clients, as well as regular speaking engagements at marketing events around the U.S. He recently launched a site dedicated to Google Glass called Glass Almanac and also blogs at Small Business Search Marketing. Matt can be found on Twitter at @MattMcGee and/or on Google Plus. You can read Matt's disclosures on his personal blog.






  • Tarjinder S. Kailey

    When Matt Cutts said “We’re trying to stop spam so people can compete on a level playing field,” I am pretty sure that lots of people (especially those who thought that Penguin is anti-small-business) can breathe easier now.

  • Raviraj Tak

    I have a question: when the Penguin update launched, many sites were badly affected by it, yet lots of sites using spam techniques were in top searches; even sites that were stuffing keywords in their titles were seen in top searches. Was this a good Penguin update? If this is what Penguin is meant to do, then I guess there will be a lot of spamming in the near future.

  • sithurajkumar

    “You don’t look at how quickly someone bounces from a search result and back to Google?” That’s a good question by Danny. Search engine results should satisfy the searchers, not the SEOs.

  • dnyaneshwar ware

    After the Penguin update, it is true that many of the websites ranking in the search results had keyword stuffing in their titles, but if you observe the link profile of each of these websites carefully, you may conclude that these websites did not have backlinks with exact-match anchor text in their titles, keywords or descriptions.
    So these websites got the benefit of the doubt from the Penguin update.
    One such good example is a site which has very beautifully created its backlinks. It is one of the biggest websites, with multiple subdomains, and they created backlinks with anchor text like “resume samples” and a number of variations in anchor text along with the domain name. They are ranking well in the SERPs.

    So we should focus on building a good link profile and make sure that majority of the backlinks are not coming from one specific keyword.

    Right now. Guest posting is the only solution for building quality links with branding URL.
    Need to work on balancing your link profile graph.

  • Alan

    You are either joking (which I hope you are) or you are truly misguided!

  • Stacey Cavanagh

    Interesting point on potentially allowing Webmasters to disavow certain links. Does that mean that Google would potentially make ALL link data available to use in Webmaster Tools, I wonder? As, at present, there’s only a select amount in there.

  • Sean Fullerton

    Some very interesting points… surprised to hear him say Google does not use Analytics for their rankings; I was sure I’d seen a lot of growing signals to that effect recently.

    Thanks for the updates, Matt, you kept up to speed very well!

  • Blue-eyed Gal

    Pointless pedantry: Dear Matt; just use corpuses, like campuses; it’s okay to anglicize Latin words. Despite the ending, corpus is a third declension neuter noun, so the original Latin plural is corpora not corpi, which sounds like a cross between a porpoise and one of QE2′s corgis.

  • Anton Koekemoer

    I almost fell off of my chair watching that video. 

  • MK Safi

    Link disavow tool? What’s this BS coming to? Will Google please get some new creative blood to run their antispam dept?

  • benlanders

    Love this one, “why not just be sincere and awesome?” Great question! Want big muscles? Why not just workout hard and eat right? Want to win the Tour de France? Why take EPO? Why not just ride 100 miles per day, every day for 10 years? Answer: Because it’s hard!!!

    Most people are averse to hard work (which is unfortunate for them, but a blessing for any company committed to “doing it right”).

  • Peter Kern

    Of course penguin is to remove web spam sites that’s why aged decent websites/businesses had been removed from the top results. Instead there are ‘better’ results from amazon, qype, etc and… new fresh rubbish websites with duplicate content. WELL DONE!

  • Peter Kern

    … and you believe in anything they tell you?

  • Peter Kern

    So… if your website suffered because of Penguin you have to build a new one. Forget about building your website for years. It is all gone. I don’t believe in recovery if you change a few things. What can you change if your website complied with Google’s guidelines?

  • robthespy

    The anti-spam department is every division in Google which makes $$$.

    It’s a heck of a lot easier, cheaper and more profitable to simply remove/limit organic listings for competitive and commercial SERPs.

    As always, the user will dictate the success of Google. In the meantime, Google will make it as difficult as possible to avoid them.

  • robthespy

    It’s not just about working hard. I’m guessing that many site owners hit by Penguin, Panda, Skunk, [insert black and white animal], had no idea that what they were doing was not good practice. And many of them worked very hard and spent a lot of money.

    Yes, ignorance is not an excuse. But IMO, you can’t call those hit by updates “lazy” or accuse them of taking the easy road.

  • Nick Stamoulis

    “that Google is considering offering a tool that allows web masters to disavow certain links”

    That would make a lot of site owners breathe a lot easier and put any fears of negative SEO to rest. It would be incredibly useful for SEO providers that take on new clients because we could go in and undo any black hat work done by previous firms and get a fresh start.

  • Bhupendra Shekhawat

    “Google does not use Analytics for their rankings”… then what will they be using?

  • RyanMJones

    I wish Matt would have clarified more on negative SEO. I’m pretty sure his definition differs from that of the audience.  He mentioned the disavow tool because people are requesting it, but he seemed to hint that it’s not needed.

    I understand the legal reasons why he can’t say if negative SEO is possible just by building bad links, but if it’s not – or not very likely, I wish he’d come out and say that to put people at ease.  

    The general consensus seems to be that it’s very easy to do and very possible, and after trying to get one of my sites penalized I can say it’s not that easy. Simple xrumer and scrapebox blasts aren’t enough to do it. All I’ve managed to do so far with those is increase my rankings.

  • jemois

    I really think Google is doing a good thing in creating a tool to disavow certain links. I think the main idea is that you can’t hurt your competitors by adding spammy links to their website but also to get rid of the unwanted links to your domain. 

  • Johndx

     Looking forward to running through a list of 5,000 links to disavow them one by one? The easy thing is to not allow link penalties at all.

  • Peter Kern

    Of course negative SEO is possible. If the websites were penalised for over-optimisation, then do it for your competitors. Simple as that. How would Google know who is doing it?

  • Johndx

    Wait and see what happens.

    Let’s put it simply: it’s very, very easy.

  • Phil

    Has anyone tried the new Remove’em tool that just came out as a way to identify and remove bad backlinks?

  • Tien V Nguyen

    I wasn’t entirely clear on that part; it seemed like it was more of a “submit URLs that you don’t want to be associated with,” rather than “here’s a list and check the ones you don’t want to be associated with.”

    Of course that opens up the possibility of abuse, where Google catches you with a bad link, so you just say “oh that’s not actually mine.”

  • Jeff Kean

    Way to completely ignore the tough question instead of asking Matt when you had the chance. Another top showing from Danny Sullivan.

  • Matt McGee

    Hi Stacey – they would really only need to show the links that they consider suspicious to do this, not every link pointing to a site/page.

  • Brian Carter

    Sometimes I open 5 of the top 10 results so I can read them all. Would that be considered bouncing back to Google? I am interested in them all.

  • Webstats Art

    Google is to blame for this. They admit that they let them (black hat) get away with cheating in the past. Billions of dollars have been lost by good companies over the years because Google search was so easily exploited, and companies who made the best products and had great people died because of the SERPs. How can you blame people who started out well but ended up buying links? Those same people saw competitors getting ahead through SEO and they had no choice.
    What is very wrong with Google is that they started bringing in all these updates all at once in early 2011 when they should have done it gradually.

    Of course it is easy for Google employees to smile because they have such power that they can influence a company’s future by dropping their web presence overnight. Those companies will be happy when a technology comes to knock Google on the head… and maybe, just like Panda, it will happen overnight.

    Do you all remember when we were all upset with Microsoft? At that time, we never imagined that someone else could challenge them. It is the same with Google. Something can change and the internet is evolving too fast for anyone to prepare for what is coming next.

  • Casey Dennison

    Exactly… anchor text is the signal they’re using and there were tons of sites that slipped through the filter. People don’t get it.

  • John Kent Williams

    Backlinks have become nonsense, abused and misused… just nonsense. Just take the silly use of backlinks out of the algo… done… no more controversy, no more bad SEO, etc. What happened to content being KING? Don’t users want to use and see content?

  • jemois

    Google hasn’t built the tool yet; they will develop it over the coming months.

  • jemois

    They want to stop the link building madness. Of all the links you get, maybe some of them are good and you want to keep them, so even if there are 5,000 it’s better to look through them and keep only the ones that count. It’s better than being penalized anyway :)) I think it’s pretty easy to “see” the spammy ones.

  • David Johnstone

    For an SMB and their target market, it’s not about creating an awesome brochure (website), it’s about providing an awesome product or service. However, Google’s algo is not clever enough to judge the quality of a product or service. And so this meaningless quest for “being awesome because Google tells me to” refers to CONTENT only. It’s like saying a company is awesome because their brochure is awesome. NO. A company is awesome because their product or service is awesome. End of story. In e-commerce, a searcher wants to find an awesome product or service; the brochure (the website) is just the shop front, nothing more. It’s because of Google’s weakness for ranking content based on content-winning links that we’re all playing this charade of trying to create the best brochure (which searchers largely do not care about; they care about your service or product).

  • David Johnstone

    Exactly – I know a lot of small businesses that got hit hard (by Penguin) and they provide a quality product / service.  They can’t just start over – their print materials have their URL, their whole legacy of doing business was married to the URL that’s now punished.  So Matt Cutts says “start over” – it’s like telling a bricks and mortar shop to move location and rebrand itself just because a 3rd party company made a decision against it.  Even worse, Matt Cutts admits that we can’t even fully protect ourselves from negative SEO, so why even bother to start again, just to be knocked down by a competitor? Crazy situation that is stifling SMBs at a time we least need it.

  • Danny Sullivan

    The tough question on negative SEO? Or paid inclusion? Or if they’re losing the war against paid links? Or would he put himself in a publisher’s shoes for a month and try building links that are supposedly so easy to get? Oh, wait, I did ask those and more.

  • Danny Sullivan

    Please rent Repo Man. Fast forward to the scene where the punk tries to rob a store, gets shot and says society is to blame. Yes, people have no choice but to buy links. People have all types of choices. Those who chose not to do it may have given up short term gains and avoided long term woes.

  • Farky Rafiq

    Do you think that CTR has an impact on rankings, as it’s displayed within WMT? I can’t help but think that if it’s low on the SERP it’s worse than a bounce, because the page isn’t relevant to the query.

  • Alan

    Danny, you asked: “DS: Did you just do another Penguin update?
    MC: No.” You should have asked: is there any update being run at the moment? You would have had a very different answer, I am betting, because webmaster chatter is as high as at any other time during the last 3 months.

  • Ian Smith

    Lots of detailed analysis going on concerning what aspects of the algorithm have changed. But one of the biggest effects does seem to be an elevation in rankings for the large national and global companies. Small businesses definitely seem to be getting hit most.

  • Danny Sullivan

    We actually already asked Google earlier that day if they’d done a Penguin update, a Panda update or any other type of major update, and were told no. That’s why we didn’t write a story confirming any of that. Asking him about the Penguin update was really a joke, because we’d just asked and seem to be asking every few days now.

  • Alan

    Don’t joke about the penguin!! Just joking! However, I still believe something is happening out there and that Matt is not telling the full story. Then again, it is his job to not tell the full story. Although another possibility is that he may not know the full story. One thing I have realized in the last few months of watching Google closer than normal is that Google’s disparate sections are not always completely aware of each other’s activities. In the immortal words of GW, “We now know what the left hand is doing with the right hand,” when he obviously didn’t know.

  • dnyaneshwar ware

    Well, Matt Cutts is very clever and knows that the Penguin update has given room for negative SEO, and to tackle that very same problem, they went on to develop a disavow-this-link tool.
    Matt Cutts will not reveal the full story at this time, but the picture will become clearer in the coming months.
    I am sure this algorithm has affected many websites which have a spammy link profile, which means that any website with too many backlinks coming from a handful of targeted keywords is surely going to get hit by this algorithm.
    This algorithm has given more importance to the brand name of the website.
    As such, links with direct brand URLs (starting with http and www) and the brand name (domain name) will be of much importance.
    And finally, the keywords in the title, the description and the keyword density of the page will play a major role in ranking high in the SERPs.

    I hope all of you are getting my point.

  • jasjotbains

    I don’t see Penguin (or Panda for that matter) stopping spam. They just messed up the rankings of sites with bad links, and helped sites with good links and crappy content come out on top.

  • Divi Fernando

    My question is – Why did Matt Cutts bring stuffed Penguins to the Conference?

    They are not a comforting soft toy to hold anymore :P

    Good to see Matt Cutts finally confronting questions about the Penguin update after a long haul of mysterious silence on the topic. Although not much is revealed, there have been certain useful disclosures thanks to the frank nature of Danny’s questions.

  • JassyPets21

    MC admitted it’s possible to perform negative SEO, which is why Google changed its language about the subject: “Our guidelines used to say it’s nearly impossible to do that, but there have been cases where that’s happened, so we changed the wording on that part of our guidelines.”

    If they thought it was still “nearly impossible”, then they wouldn’t have changed the language.

  • JassyPets21

    I think a tool that allows site owners to disavow links is an AWFUL idea. Small business owners work hard. Now you want them to comb through WMT and GUESS at which links MIGHT be providing negative value, and disavow them? We know that WMT does not display all links anyway. What do you do about those? How do you find them? How do you take them down? How do you disavow them?

    The solution is VERY easy. Do not pass negative value from links. And do not penalize for over optimization of anchor text. This solves negative seo.

  • Webstats Art

    Hi Danny,

    Short term gains can be millions upon millions for a business. All those link buyers wanted was a window to become popular through exploiting Google. Once they reached a certain point, they did not need Google anymore because they had made a brand by exploiting Google. People have stolen technology, made products and sold them through the internet. When they made their millions, they all of a sudden became legit. Of course they have money to pay the best people to provide their Panda/Penguin-resilient content.

    This is a repeat of the history of modern gangs and, of course, the mafia.

  • Wouter Kiel

    “DS: [...] You don’t look at how quickly someone bounces from a search result and back to Google?
    MC: [...] No, Google does not use analytics in its rankings.”
    That’s a clever answer. MC says nothing about using G-bounce or whatever, which surely must be a ranking signal. They’re already using it to enable domain blocking in the SERPs.
