• http://www.ihsekat.com/ Takeshi

    I’m loving the ability to mark errors as “fixed”. It’s nice to clear out a page of errors without having to wait for Google to re-crawl everything. I also agree with their decision to move the robots.txt stuff out of the errors tab; I’ve personally never found it useful, and was always confused about why pages I had purposely blocked were showing up as “errors”.

    The loss of the ability to download all the errors is a shame, though. It seems like a lot of these changes make larger sites less convenient to manage, while making things simpler for smaller sites.

  • http://scalefigure.com Jason Meininger

    Good overview Vanessa. You of all people are the ideal one to call out the problems with the new changes.

    I’m seeing a big upswing in “soft” 404s, and on investigation many of them appear to be valid 301s or indeed still-functioning pages with no clear information about *why* they’ve shown up in this list. It makes me lose confidence in the report.
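
    Here’s the rough sanity check I’ve been running on those (a minimal sketch in Python with the requests library; the flagged URLs are just placeholders for whatever your report lists):

        import requests  # third-party; pip install requests

        # URLs Webmaster Tools flagged as "soft 404" -- placeholders only
        flagged = [
            "http://example.com/old-page",
            "http://example.com/still-works",
        ]

        for url in flagged:
            resp = requests.get(url, timeout=10)  # follows redirects by default
            # A genuine soft 404 returns 200 with "not found" content;
            # a clean 301 or a normally working page suggests the report is stale.
            print(url, "->", resp.status_code, "final URL:", resp.url)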

    I think the ‘mark as fixed’ function is pretty useless if Google doesn’t agree things are fixed, and on a large site I’m not remotely likely to bother going through ticking boxes. They can determine whether it’s fixed when they crawl the site. I still find it frustrating that once reported, errors take ages to go away on their own, even if the error didn’t happen the next time Google crawled. Knowing something broke for a little while is far less useful than knowing what is broken *right now*, but they all seem lumped together.

    I also agree that losing the ability to download the errors is a major fault; it means we’ve lost a whole lot of data on an enterprise-level site and made troubleshooting a lot more foggy.

    I wonder how many of these changes were based on actual user feedback?

  • http://www.jlh-marketing.com Jenny Halasz

    How disappointing to see a company that is all about user experience make their tool less functional. Thanks for taking the time to go through all this for us, Vanessa; I was wondering if I was just missing something. I hope your considerable clout at Google can get this fixed for all of us who rely on Webmaster Tools for important data.

  • Chas

    Hard 404~ Closed all Google Accounts~ Error Fixed.

  • http://www.treeeye.com Lnoto

    Some errors are inevitable when a tool changes this much. Revamping it is still a good move.

  • http://ides.com/nathanpotter pottern

    Thanks Vanessa – great coverage of the new Google Webmaster Central update. I too was disappointed with the first two items you covered (the ability to download all crawl error sources, and the 100K URLs of each type). We host a large site and these two features were incredibly useful for us – hoping they add them back into the new design, which, I agree, is otherwise a nice update. Thanks again!

  • http://www.molotov-peacock.co.uk Kat Wesley

    Great roundup.

    I’ve noticed on the sites I manage that Not Followed is indeed a list of redirects that don’t work for some reason, rather than all the 301s and 302s that exist on the website.

    Clicking the URL does bring up more information about the error and the option to fetch the page as Googlebot, similar to the other new panels. However, I don’t have enough different URLs listed under Not Followed to work out whether the message (“There was a problem with active content or redirects”) changes based on what triggered the error, or is just a generic hint about why the URL is listed in the Not Followed tab.

    If we really can’t drill down to the cause of the error anymore, I’ll be disappointed, but bringing the option to fetch the URL as Googlebot into this section does at least make it a little easier to work out what might have gone wrong.
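
    In the meantime, the workaround I’ve been using is tracing the redirect chain by hand (a minimal sketch in Python with the requests library; the URL is a stand-in for anything in your Not Followed list):

        from urllib.parse import urljoin

        import requests  # third-party; pip install requests

        def trace_redirects(url, max_hops=10):
            """Print each hop of a redirect chain until it stops redirecting."""
            for _ in range(max_hops):
                resp = requests.get(url, allow_redirects=False, timeout=10)
                print(resp.status_code, url)
                if resp.status_code not in (301, 302, 303, 307, 308):
                    return  # reached a page, an error, or a dead end
                # Location may be relative, so resolve it against the current URL
                url = urljoin(url, resp.headers["Location"])
            print("Stopped after", max_hops, "hops -- possible redirect loop")

        trace_redirects("http://example.com/not-followed-url")

    A loop, a hop that times out, or a redirect to a blocked or invalid URL would all be plausible reasons for a redirect to end up in that tab.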

  • Joey Garcia

    Using this new tool I discovered that external websites are pointing to fake links on my site. I don’t want to add a page just to avoid a 404, so is there a way to highlight those links and indicate to Googlebot not to crawl them? The external site that created them just generates the URLs, and clicking one redirects to some escort service page.

    Is there anything I can do about this?
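
    One thing I’m considering trying (a hedged sketch; /link-gen/ is just a placeholder for whatever path prefix the generated URLs happen to share) is blocking the pattern in robots.txt:

        User-agent: *
        Disallow: /link-gen/

    That only works if the fake URLs share a common prefix, though. Otherwise, letting them return a plain 404 (or 410) should be harmless, since 404s for pages that never existed don’t affect the rest of the site.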