  • incrediblehelp

    Is this process realistic for sites with over 100k pages?

  • http://www.blizzardinternet.com Carrie Hill

    Hi Incrediblehelp

    I’ve never done this for a site that large; I think the largest was around 25k URLs. That site was doing well overall, but some sub-folders were having issues, so I could break the site down even further.

    There are definitely things you can do to help with large-site indexing issues:
    *Check your robots.txt to be sure you’re not blocking something important (see the sketch just after this list)
    *Check your parameter handling in Webmaster Tools
    *Build more incoming links to deep pages of the site (sort of a “duh” one – but still something that should be done)
    *Make sure there’s a link to every page you want in the index, and that those links are followable by the spiders
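
    For that robots.txt check, here’s a minimal Python sketch using the standard library’s urllib.robotparser. The example.com domain and the list of important paths are placeholders to swap for your own:

        # Minimal sketch: verify that pages you care about aren't blocked
        # by robots.txt. Domain and paths below are placeholders.
        from urllib.robotparser import RobotFileParser

        SITE = "http://example.com"   # placeholder -- use your own domain
        IMPORTANT_PATHS = [           # pages you expect to be crawlable
            "/",
            "/products/widgets/",
            "/blog/indexing-tips/",
        ]

        parser = RobotFileParser()
        parser.set_url(SITE + "/robots.txt")
        parser.read()                 # fetches and parses the live robots.txt

        for path in IMPORTANT_PATHS:
            # Check against Googlebot; any user-agent string works here.
            allowed = parser.can_fetch("Googlebot", SITE + path)
            print(path, "OK" if allowed else "BLOCKED -- check robots.txt!")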

    Large sites of 50k or 100k pages or more don’t let you look at every single URL, but you should still be cognizant of your footprint and of which sections or directories of the site aren’t indexing, caching, and performing well. These steps can help you fix those issues.
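
    One way to keep tabs on that footprint without eyeballing every URL is to tally spider hits per directory from your server access logs. Here’s a rough Python sketch; the access.log filename and the combined Apache/Nginx log format are assumptions, so adjust for your setup:

        # Rough sketch: count Googlebot requests per top-level directory
        # from a combined-format access log (filename is an assumption).
        import re
        from collections import Counter

        # Captures the request path and the trailing user-agent field.
        LINE_RE = re.compile(
            r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+".*"(?P<agent>[^"]*)"$'
        )

        hits = Counter()
        with open("access.log") as log:
            for line in log:
                m = LINE_RE.search(line)
                if not m or "Googlebot" not in m.group("agent"):
                    continue
                # Bucket by first path segment: /products/widgets/1 -> /products
                first = m.group("path").lstrip("/").split("/", 1)[0]
                hits["/" + first] += 1

        for directory, count in hits.most_common():
            print(count, directory)

    Keep in mind anyone can spoof the Googlebot user-agent string, so treat the counts as a rough signal rather than hard data.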

    Hope this helps :)
    ~Carrie

  • wp themedesk

    I started a new website that was indexing properly, but for the past few days some links are not being indexed, and others appear in Google search results under the homepage URL. A while back they were properly indexed with their post URLs when searched in Google. May I have some suggestions for this problem? My site is http://www.wordpressthemedesk.com and I am using the Yoast WordPress plugin.