• http://seoblog.intrapromote.com/ Erik

    Danny,

    What you describe about SEL / Digg is really common, and I guess you’re seeing the other side of it. We had plenty of posts that SEW either covered or linked to in passing, and the SEW post nearly ALWAYS outranked ours. In the end, we just comforted ourselves by saying, “well, at least people are getting the info from somewhere, right?”

    No, it didn’t make us feel better either… ;-)

  • http://brianmseo.blogspot.com Brian M

    Hi Danny,

    It would be great to have some way to let Google know that there was a problem, other than leaving posts in forums.

    How about a “Contact Us” link in Webmaster Tools so we could let them know when we see a problem or don’t understand why our home page has been dropped from the index (that also happened here in the US this week)? We can report spam, or request reinclusion if our site has violated their guidelines, but there is no place to say, “Help!”

    Yes, they would probably be flooded with requests, but it would be better than the convoluted way that information is getting passed around.

  • http://www.wolf-howl.com graywolf

    Darn, I was going to blog about Digg “beating” you on the sex blog thing. I came across it last night when putting the Threadwatch story together.

  • http://seo-theory.blogspot.com/ Michael Martinez

    I think part of the Duplicate Content Freakout is due to the old school SEOs trotting out ‘duplicate content goes into the Supplemental Index’ responses in knee-jerk fashion every time someone complains about their pages going Supplemental.

    If people would just stop and do some actual analysis or ask for more information before providing explanations in the SEO forums, the signal-to-noise ratio would improve tremendously.

    Even many admins and moderators don’t bother to get enough facts before handing out irrelevant stock answers. So the SEO advisory community is very much to blame for many of the misunderstandings about duplicate content simply because the “experts” are very dismissive of anyone whose content has gone Supplemental.

  • http://sethf.com/ Seth Finkelstein

    Take a look at the SERPs now for the term that starts-with-vee-and-rhymes-with-“Niagra”.

    How is this reconciled with Matt Cutts’ statement:
    http://blog.outer-court.com/archive/2006-08-03-n29.html

    “And the fact is, we don’t really have much in the way to say, this is a link from the ODP, or from .gov, or .edu, to give that some sort of special boost. It’s just those sites tend to have higher PageRank, because more people link to them (and reputable people link to them).”

    While it may be all coincidence or trends in spamming, it sure looks like .edu sites have TrustRank bonuses.

  • http://jambecorp.blogspot.com James

    >CSS Crawling

    What I’m interested in is how Google handles text hidden using CSS for other, legitimate reasons – particularly Ajax-type situations, or people who use CSS/JavaScript to show and hide menus (the kind of pattern sketched below)…

    Otherwise finding hidden text seems like a good idea to me :)
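
    For illustration, the kind of legitimate show/hide menu being described usually looks something like the TypeScript sketch below. This is a minimal guess at the pattern; the element IDs and the click wiring are assumptions, not anything Google has published.

        // A minimal sketch of a legitimate CSS/JS show-hide menu: the menu text
        // is present in the page source (so a crawler sees it) but stays hidden
        // until the user clicks. The IDs here are illustrative assumptions.
        function toggleMenu(menuId: string): void {
          const menu = document.getElementById(menuId);
          if (!menu) return;
          const hidden = getComputedStyle(menu).display === "none";
          // The same display:none that spammers abuse is what makes dropdowns,
          // tabs and Ajax panels work; the intent differs, not the CSS.
          menu.style.display = hidden ? "block" : "none";
        }

        // Assumed markup: a #nav-toggle button and a #nav-menu list that the
        // stylesheet hides with display:none.
        document.getElementById("nav-toggle")?.addEventListener("click", () => {
          toggleMenu("nav-menu");
        });

    Whether a crawler can tell this apart from spam-style hiding by reading the CSS alone is exactly the open question here.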

  • http://seo-theory.blogspot.com/ Michael Martinez

    Seth Finkelstein wrote: “While it may be all coincidence or trends in spamming, it sure looks like .edu sites have TrustRank bonuses.”

    It looks more like people have built up trust in those pages through inbound linkage because the domains themselves have not been lumped in with so-called “bad neighborhoods”.

    Trust is a self-managing mechanism. All the search engines can do is pick a standard for measuring trust and apply it (one such standard is sketched below). Each standard has some flaws, of course.

    BTW — as Danny noted here on SearchEngineLand (and elsewhere), “TrustRank” is a Yahoo! term. Google uses “Trust Filters” (per Matt Cutts on his blog).
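
    As a rough sketch of what “pick a standard for measuring trust and apply it” can mean, here is the seed-propagation idea from the original Yahoo! TrustRank paper (Gyöngyi et al., 2004) in TypeScript. The graph, damping value and seed choice below are made up for illustration; this is not a description of what Google or Yahoo! actually run.

        // TrustRank-style propagation: hand-reviewed seed pages start with all
        // the trust, and each iteration passes a damped share of a page's trust
        // along its outbound links. Pages reachable from seeds end up trusted.
        // Assumes every link target is also a key of the graph.
        type Graph = Record<string, string[]>; // page -> pages it links to

        function trustRank(
          graph: Graph,
          seeds: string[],
          damping = 0.85,
          iters = 20
        ): Record<string, number> {
          const pages = Object.keys(graph);
          let trust: Record<string, number> = {};
          for (const p of pages) trust[p] = seeds.includes(p) ? 1 / seeds.length : 0;

          for (let i = 0; i < iters; i++) {
            const next: Record<string, number> = {};
            // Seeds get a (1 - damping) share re-injected every round;
            // non-seeds start each round at zero.
            for (const p of pages) {
              next[p] = seeds.includes(p) ? (1 - damping) / seeds.length : 0;
            }
            for (const p of pages) {
              for (const q of graph[p]) {
                next[q] += (damping * trust[p]) / graph[p].length;
              }
            }
            trust = next;
          }
          return trust;
        }

        // A page linked from the trusted seed inherits trust; the unlinked
        // "spam" page gets none, whatever its own content says.
        const g: Graph = { edu: ["a"], a: ["b"], b: [], spam: [] };
        console.log(trustRank(g, ["edu"]));

    Note that nothing in the sketch treats .edu as special: any bonus comes entirely from where the seed set sits and who links to whom, which is the point being made above.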

  • http://linkleecher.com Tom Churm

    Hi Danny,

    I discovered your radio shows about three months ago now via WebMasterRadio.fm and have become a regular listener and a subscriber to your sites’ feeds.

    >>What’s your problem?

    OK, since you asked…

    My problem is that I have a site being penalized (churm.com – full link not added to avoid giving you any kind of penalty); it’s out of the index. In Google’s Webmaster Tools I receive no message stating that I’m being penalized – perhaps because it’s an affiliate site and Google therefore thinks it’s evil?

    I believe my site adds content by offering RSS feeds for Amazon products and searches – something Amazon itself does not offer. I also believe that because my site has a more minimal layout than Amazon’s, it is easier for at least some people to use.

    But several reinclusion requests sent off over a period of several months have resulted in not a single response from Google… and a site not found in Google is, these days, a dead site.

    Is there anything else I can do, or do I have to just write off the favorite domain in my portfolio as unusable?

    Thanks,

    Tom

  • http://searchengineland.com/070111-100415.php core3

    RE: “Country-Specific Results & Lost Home Pages”.
    I’m glad to learn that Matt Cutts has promised to look into this problem, although his comments so far seem to indicate doubts that it IS a problem. I just want to add confirmation from my own experience in the UK that it comes up frequently. British companies with sites hosted in the UK, and with plenty of evidence in their site content that they are UK-based or UK-only businesses, but which chose a .com domain (rather than .co.uk), can be listed in the top ten default results of google.co.uk yet not found at all when the same search is performed using the “pages from the UK” option. Companies that chose a .co.uk domain are guaranteed an appropriate ranking in the “pages from the UK” results.

    Otherwise, however, there is little difference between Google’s default results and its “pages from the UK” results, at least for users located in the UK, and it’s hard to understand what purpose is served by offering the two options at all. If the default results offered a ‘world view’ by ignoring the user’s (UK) location, and the “pages from the UK” option a (very different) UK-centric view, it would make some sense. There is a strong case for making “pages from the UK” the default at google.co.uk – perhaps with an option to choose a wider-world view as an alternative – in other words reversing the present situation, while at the same time making the results that are not “pages from the UK” a real alternative.
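
    To make the reported behaviour concrete, the TypeScript sketch below is a guess at the kind of test being described. The signals and the rule are pure assumption, inferred only from the behaviour described above; Google has not documented anything like this.

        // A guess at the "pages from the UK" test implied by the report above:
        // the filter seems to key on the ccTLD, since even UK-hosted .com sites
        // with clearly UK-only content are excluded. All of this is assumption.
        interface PageSignals {
          domain: string;         // e.g. "example.co.uk" or "example.com"
          serverCountry: string;  // country code from IP geolocation, e.g. "GB"
          contentSaysUK: boolean; // on-page statements like "UK-only business"
        }

        function isPageFromUK(page: PageSignals): boolean {
          // The ccTLD appears decisive; per the report, neither UK hosting nor
          // on-page evidence rescues a .com, which is exactly the complaint.
          return page.domain.endsWith(".uk");
        }

        // A UK-only business on a .com fails even when hosted in the UK:
        console.log(isPageFromUK({
          domain: "ukwidgets.com",
          serverCountry: "GB",
          contentSaysUK: true,
        })); // false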

  • http://linkleecher.com Tom Churm

    Hi Danny,

    Maybe I’m being a little naive here, but you asked us for questions that you would pose to Matt…

    …Then you visited him and came back, but we never found out whether our questions were put to him.

    I have a serious problem with my site being delisted and don’t know anyone at Google who I can ask for help.

    Also, repeated reinclusion requests have resulted in no response from Google.

    Is there any way I can get help with my aforementioned problem?

    Thanks and I’m sorry to bother you with this,

    Tom