SMX Advanced: You&A Conversation With Matt Cutts (Pandas, Penguins & Payday Loan Spam)



Day one of our SMX Advanced show is almost over, but there’s one last session still to come: the traditional “You&A with Matt Cutts” keynote session.

Search Engine Land’s founding editor, Danny Sullivan, will be sitting down for a lengthy chat with Matt Cutts, the head of Google’s webspam team.

The session is due to begin at 5:00 pm, and I’ll be liveblogging it right here as fast as I can. Given all the recent Google-related news — Penguin updates, Panda updates, actions taken against link networks and more — it should be a lively discussion.

See you at about 5:00 pm PT!

We’ll be starting soon. While we wait, the stage has been decorated with animals that start with the letter “P” — a pig, pug and polar bear.

(Photo: the “P” animals on stage)

And we’re underway with a packed house, starting with Danny asking: if we use the NSA PRISM program, can we finally see Google’s “not provided” keyword data?

HUGE roar of laughter from the audience.

Danny references the three animals and canvasses the audience about which animal they think will be the next Google update, and shows his new “Matt Cutts Debunking Flow Chart” t-shirt.

DS: What’s going on with Panda? You said you’d stop confirming updates. How many have there been since then?

MC: We had one about a month and a half ago. We haven’t updated it since then because we’re looking at pulling in a new signal that might help pull some people out of the gray zone.

Gives some background on Panda updates: Panda update 1 was really large. As you do more, the updates get smaller and reach a steady state where the data gets baked into the index. In an average month, new Panda data is rolling out on about 10 days — so Panda is happening for roughly one-third of the month.

DS: So we’re kinda like at 26?

MC: –ish.

DS: Why not just update them and announce them?

MC: We used to do that. By the end of the year, people were like … okay, 53 Panda updates, can we stop talking about it now?

We do more than 500 algorithm changes every year. It’s always difficult to assess what to share. With Penguin 2.0, we knew that a lot of people would be affected, so we wanted to get the word out.

DS: We have a lot of tools out there that try to give weather forecasts. When we ask, sometimes you tell us something’s happening and sometimes you say nothing is. Are these tools crazy?

MC: It’s not that the tools are crazy, but there is a lot of sampling and sub-sampling skew. We’re rolling out a change that will affect about 3-5% of queries, which isn’t much, but on the black hat forums people are already talking about it.

DS: Asks about the Penguin 2.0 update.

MC: Penguin 1 affected entire sites. Penguin 2 can impact individual pages. (Matt’s note: not sure I got the right exact wording there. Sorry.)

The change we launched earlier today was targeting things like payday loans on google.co.uk, and we have more stuff launching in a couple weeks.

Matt asks how many people are in-house SEOs – a huge number of people raise their hands.

He reminds people who were doing illegal stuff for rankings — using the payday loan niche as an example — that Google warned long ago it wouldn’t work forever. He tells the audience they shouldn’t assume black hat tactics will always work just because they see people saying they’re working now.

DS: We’ve had a back-and-forth on links. I’ve characterized them as the fossil fuel of signals. I thought my head was gonna explode with the disavow links tool — I got a link removal request on Tuesday for a link on Search Engine Land.

Why don’t you just disavow all the bad links yourself?

(Audience claps)

MC: An SEO asked me why he had to clean up all these messy links that someone else got. I said to him that it’s like a one-time market correction. Everyone should look up the rant Danny did at SMX Advanced last year about how you should want to get quality links, not easy links.

We are going through a transition, but we’re moving to a healthier world where it gets harder to spam every year. The disavow tool is there to help people clean up that mess when they can’t get bad links removed.
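(For reference: the disavow file Matt is describing is just a plain text upload, with bare URLs for individual pages, a domain: prefix for whole domains, and # for comment lines. A minimal sketch of building one in Python, using made-up domains and URLs:)

```python
# Minimal sketch of building a disavow file for Google's disavow links tool.
# The domains/URLs below are made-up examples; the format shown (one entry per
# line, "#" comment lines, "domain:" prefix for whole domains, bare URLs for
# individual pages) is the plain-text format the tool accepts.

bad_domains = ["spammy-link-network.example", "paid-directory.example"]
bad_urls = ["http://forum.example.org/profile?id=12345"]

lines = ["# Links we could not get removed after outreach"]
lines += [f"domain:{d}" for d in bad_domains]   # disavow every link from these domains
lines += bad_urls                               # disavow these individual pages only

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```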

DS: Now links have been devalued.

MC: It’s definitely the case that now, compared to 6-7 years ago, fewer people are likely to think they have to buy links to succeed. It costs effort to make a great site. That’s always the intent.

I’ve had black hat SEOs write me and say they can’t do it anymore because it’s not a sustainable income. Our guidelines have always been kind of steady – make a great site so that people want to link to it. Now we’re bringing tools to bear that make that real.

(Photo: Matt Cutts and Danny Sullivan on stage)

DS: How do we know what links count anymore?

MC: Your rant last year was good. A press release link — you’re paying $100 for a link. That shouldn’t count. But making a good site that earns links is what you want. One kind of link will stand the test of time and one won’t.

Matt points to Apple as a company that focuses on great experience for users, and they’re doing well. Encourages audience to think that way.

DS: Give us an update on penalties and how they’re working now.

MC: At the reception last night, an attendee didn’t know this, so I need to reiterate: Algo updates happen 500 times a year and we don’t message webmasters on those. But if we’ve taken direct manual action that will affect your site, you will almost always receive a message in Google Webmaster Central.

One thing that’s new is we’re testing the inclusion of example URLs — when we send a manual action notification, we’ll include one or two or three sample URLs to show what’s wrong.

(Audience applauds)

DS: What’s the maximum penalty?

MC: If a domain is completely awful and we think there’s no redeeming qualities to it, we might set penalties to not expire until the domain itself expires.

DS: When they expire, you go back and review?

MC: Right now, the penalties automatically expire. All other things being equal, the rankings will then come back. Google assumes their spam fighting is good enough that, if the domain breaks rules again, it’ll get caught (again).

Next topic: Google’s smartphone news from today.

MC: You really need to be thinking about mobile. We’re starting to think a lot about mobile. We’ve noticed a couple of common problems:

* when every URL on your site redirects to a single mobile URL (see the sketch after this list)

* infinite loops when Googlebot gets sent back and forth between the feature phone version of a site and the desktop version

* at Google I/O, there was a session on instant mobile websites, with page speed recommendations. We’ve said before that speed matters for desktop sites; we might start doing the same thing for mobile websites.
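On the first bullet above, the underlying point is that a smartphone visitor should land on the equivalent mobile page, not always on one catch-all mobile URL. A minimal sketch of that idea, assuming a hypothetical m.example.com mobile host and a deliberately crude user-agent check:

```python
# Minimal sketch (not Google's code) of redirecting smartphone visitors to the
# equivalent mobile page instead of a single mobile URL. The hostname and the
# user-agent tokens are illustrative assumptions, not a production-ready check.
from typing import Optional

MOBILE_HOST = "m.example.com"                      # hypothetical mobile host
MOBILE_TOKENS = ("iphone", "android", "mobile")    # crude UA heuristic, for illustration

def mobile_redirect_target(path: str, user_agent: str) -> Optional[str]:
    """Return the mobile URL a smartphone visitor should be sent to, or None."""
    if not any(token in user_agent.lower() for token in MOBILE_TOKENS):
        return None                                # desktop visitor: no redirect
    # Faulty pattern: always return "https://m.example.com/" regardless of path.
    # Better: preserve the path so deep links land on the equivalent mobile page.
    return f"https://{MOBILE_HOST}{path}"

if __name__ == "__main__":
    ua = "Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X)"
    print(mobile_redirect_target("/widgets/blue-widget", ua))
    # https://m.example.com/widgets/blue-widget (not just the mobile homepage)
```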

DS: When is the smartphone thing happening?

MC: It’s been approved. I don’t know when it’ll roll out.

Now we’re on to audience Q&A.

DS: Why don’t you just give a site a list of all their known links?

MC: We’re looking at that, but it would be way out in the future. You can imagine how that data might be used the wrong way by some people.

DS: Two years ago you said not to worry about “not provided” because it was only a single-digit percentage of queries.

(audience applauds)

MC: I would be delighted to answer that question! Says the 10% figure was for English only and Google.com searches only. When I said single digit, the PR person on the call was like … “nooooooooooooooo.”

We were right at the time, but we’ve continued to roll it out since then.

Given what’s happened in the last 10 days with privacy, I’m okay with not sharing search terms. (Matt: I’m paraphrasing here.)

Goes on to talk about privacy and the need for protecting users.

DS: Why not give us all our data in Webmaster Tools instead of just the 90-day window?

MC: The people that want it can download it and keep all their archives.

Danny and Matt have a very fast back-and-forth exchange on this topic. Matt says he’ll take the feedback to Google.

DS: Why is Panda large-brand focused?

MC: It’s not large-brand focused.

DS: Why not?

MC: We look at all the data we have. We don’t target brands.

DS: (audience question about Penguin)

MC: Changes have different impact in different languages/countries.

DS: Earlier in this session, Matt mentioned affiliates and black hats in the same sentence. Does Google view affiliates as spammers?

MC: I regretted saying that as soon as it came out. There are a lot of good affiliates that add value. But by volume, we tend to see more affiliates that are not adding value. Hipmunk is an example of a site that adds value as an affiliate.

DS: question about fast sites/slow sites

MC: You don’t get a boost for having a fast site. Sites that are outliers in terms of being slow will rank lower. All other things being equal, a site that’s too slow will rank lower.

DS: question about Facebook data affecting rankings – references Eric Enge’s presentation earlier today about FB data impacting rankings.

MC: I like Eric, but I disagree with his conclusions. (Eric comes to the stage.) Facebook and Google usually don’t get along well. We’re not able to crawl that many pages and we don’t have a special feed of “likes.”

Matt points out that great content that ranks well also tends to get more likes. Wants to see Eric’s URL samples and data to see why he reached the conclusion he did.

Matt also says he liked the parts of Eric’s presentation where he showed that installing Google Analytics and using Chrome did not lead to pages getting indexed.

DS: Does Google have different ranking factors for different industries?

MC: We have looked at topic-specific ranking. The problem is it’s not scalable. There’s a limited amount of that going on — you might have a very spammy area where you decide to do some different scoring.

What we’re doing better is figuring out who the authorities are in a given category, like health. If we can figure that out, those sites can rank higher.

DS: How many different categories?

MC: Lots.

While Danny reads through audience questions, Matt interjects —

MC: We’re rolling out a test of a “structured data dashboard.” There’s a beta for people interested in trying it. The URL to volunteer is bit.ly/sdtesters.

DS: question about bounce rate

MC: Answer is same as last year: as far as I know, we don’t look at that as a signal.

Let’s call it ‘user behavior’ in general: I’m very skeptical of it as a signal. He references the old happy/frowny faces on the Google Toolbar and says they found the data was really skewed by self-voting. I’m skeptical that people wouldn’t spam it.

DS: That’s why you don’t use Google +1s.

MC: Yes, for the same reasons. It can be sparse. It can be unreliable. That’s why I push back when you ask me about this.

DS: What about the manipulation of suggested search results?

MC: Says he’s talking personally and isn’t involved in that team. Repeats that it’s just a reflection of what people are typing in.

DS: What’s the most overrated thing out there in SEO?

MC: Short-term social data. Longer-term, though, will be different.

DS: Most underrated?

MC: Design and user experience. Put some work into the polish of your site.

DS: Biggest surprise over the past year?

MC: Trying to figure out what people will or won’t notice.

And with that, we are done. Thanks for reading along!

For more of our coverage on Matt’s talk at SMX Advanced see Google’s Cutts Talks Structured Data Beta, Mobile Site Speed Need, Penalty Notices To Get Example Links & More.




About the author

Matt McGee
Contributor
Matt McGee joined Third Door Media as a writer/reporter/editor in September 2008. He served as Editor-In-Chief from January 2013 until his departure in July 2017. He can be found on Twitter at @MattMcGee.
